I believe in ‘life-long learning’: I continue to learn and discover new things every day. I’m writing tutorials to give something back to the community from which I have learned so much.
On top of this, I receive emails on a nearly daily basis asking for help. Many articles have their origin in such requests or questions. I prefer questions or comments in a public forum, because that way everyone else can benefit from them too. Last week Alessandro contacted me with this:
“I hope this finds you well! I’m starting to use ARM processors, but I find them quite complicated on the configuration side. I started in the past with PIC micros (PIC16) in asm, and I found them quite straightforward to configure (clock, IO, peripherals, …). Then I moved on to the C language, and to the PIC18, without any big issues.
Now I would really like to join the ARM community. I see that these processors are what I’ve always been looking for, in energy, computing power, peripherals, and FINALLY in the IDE (editor, toolchain and utilities)… AMAZING!!!”
The topic is how to get started developing for ARM. Alessandro agreed to make this public, so I thought it would make a good topic for an article.
Long time ago…
I started assembly programming on a MOS Technology 6510 and a Motorola 68000, then wrote my first C compiler for the Hitachi H8/500 and Motorola HC11, and many other architectures followed. Like Alessandro, I have been pulled into the amazing and powerful world of more advanced processors as they came out over time, and into the fascinating world of IDEs like Eclipse. I have had the pleasure of working with many microcontrollers and processors, but in retrospect those early processors and microcontrollers are my favorites: they were rather easy to configure and use compared to a modern ARM Cortex microcontroller. And that’s Alessandro’s point too:
“Here my point: An ARM Cortex-M3/M4 microcontroller has a datasheet that is roughly 1000 pages long.
I tried to understand, from this and the example code “shipped” with the board, the “configuration phase”, but unfortunately there is often a lack of information, so I can’t understand exactly why the manufacturer wrote that line of code instead of another…
Many things are taken for granted… Nobody taught me that ;(“
So it is not only me thinking that there is a problem. Maybe all of us who grew up with the technology take things for granted. Because the ‘older’ ones know a lot of the legacy, it is easy to forget that not everything is obvious. It is easy to know ‘too much’ and to forget how much experience and knowledge it takes to understand and use these modern architectures. As for myself, I have realized that my early tutorials were much more basic than my most recent ones. Am I getting too advanced?
As for the ‘1000 pages long data sheet’: the ARM Cortex-M documentation alone runs to several thousand pages. ARM microcontroller vendors like STM, TI, NXP, etc. integrate the ARM IP (the core) and add their own special peripherals, so why should they explain how the ARM IP works? Go look at the ARM web site! But not everyone already knows how the NVIC works (see my article series), yet it is essential for using the device. It makes sense that this information is not copied over and over, but for someone starting with the device it is not obvious, and it makes for a difficult start. For example, the NXP K64 Sub-Family Reference Manual has 1789 pages and does *not* include anything about the ARM Cortex core itself: it is only about the peripherals (UART, SPI, etc.), and all the other important things are hidden behind a ‘yellow’ box with some references to the NVIC, etc.:
Definitely not an easy starting point for a newcomer.
Of course the number of pages does not say anything about the quality of the documentation. There are interesting comments on that subject in a Hackaday article (http://hackaday.com/2016/10/27/qualcomm-buys-nxp-in-largest-ever-semiconductor-deal/). One of the comments references the M68HC11 reference manual (http://www.nxp.com/assets/documents/data/en/reference-manuals/M68HC11RM.pdf): about 650 very well written pages that include everything I need to know about the device. Plus there was a small programming reference guide with package information, instruction encodings and all the peripheral settings (ADC, SCI, ports, etc.):
In contrast to this, below is the ARM ‘bible’ written by Joseph Yiu: more than 800 (excellent!) pages, but only about the core itself:
Should I learn Cortex?
Yes, the Cortex-M (ARMv7) has a different complexity than the HC11, and an ARM Cortex-A would be even more complex! So is the ARM Cortex-M really the right starting point for learning about microcontrollers?
We are having a similar debate at the university: the entry-level microcontroller course uses the Motorola/Freescale-now-NXP-soon-maybe-Qualcomm HCS08 core. It is a rather simple 8/16-bit microcontroller: easy to understand, no pipelines or caches, with straightforward peripherals; a perfect device for learning how a microcontroller works. But while the S08 is produced in high volume, it is a proprietary core and more a thing of the past. Most vendors have moved on to producing ARM-based devices, and very likely the students will end up using an ARM in their workplace or in their projects. So using the HCS08 makes a lot of sense from a learning and didactic point of view, but not from a marketplace perspective.
I want to understand it
Alessandro makes more good points:
“In particular, I would be really REALLY glad to understand how to configure (using just code, not other graphical tools -> I want to understand it!):
* The microcontroller clock for the CPU, peripherals and other uses… starting from the datasheet pages down to the code.
* The I/O for general purpose and for peripherals… starting from the datasheet pages down to the code.
As soon as I understand it, I will be happy to switch to the many graphical tools available to do that job!
I know that you are very, very busy, but I assume that out there are many people like me who still have the same problem switching to the ARM microcontroller platform, and we would all be very Very VERY VERY GRATEFUL if you could write a great (as you always do) article on these topics.”
I absolutely agree: I have to learn and understand the basics before moving on to high-level tools, such as a graphical pin-configuration tool:
This is not only true for graphical configuration tools (which are a great help!); the same applies to using a compiler: without an understanding of how a microcontroller works and how it uses assembly instructions to perform a calculation, using a high-level language compiler is like ‘flying in the dark’. Clearly, this does not mean we have to go back to doing everything in assembly language, but a good developer should have a good idea of what gets executed as a result of his source code being compiled and run on the target.
The same goes for all the graphical configuration tools like Processor Expert, STM32Cube, or the various configuration and pin-muxing tools: I have to know how things are supposed to work behind the scenes to make good design decisions. The tools are there to help me, so that I don’t have to memorize thousands of pages of data sheets and reference manuals. It takes both: the ‘helper tools’ and a solid reference manual.
However, how the clocking or peripherals like UART, USB, SPI or I2C work is very vendor-specific. There are common concepts like clock gating, prescalers, buffering, interrupts, latching, latency and configuration registers, but the implementation details are vendor- and even device-specific. That means, in the extreme case, these things have to be re-learned for every new device. On the plus side: what I have learned about the ARM Cortex architecture can be re-used.
What about all the examples from the microcontroller vendors? They can be a good starting point. But I find many examples are not well documented, and if there are comments in the sources, they tell me the ‘what’ and not the ‘why’. Or the examples do not do what I believe they should do: how much sense does a ‘blinky’ example make if it does not use the LED on the board?
In the past I have seen good application notes, but then it is challenging to find the source code for them. Or the sources/projects are outdated because the tools used no longer exist or have been replaced by other, incompatible tools.
I wish there were more books (see the Books section). I like to have a paper version of good information, so I can make notes on the pages or add sticky notes. But maybe I’m alone in this: I see only a few students printing the lab instructions or the script (why?). Publishing books is risky and expensive, although there are new ‘on demand’ models which sound interesting. Still, there is a fundamental problem: a good book about developing for a processor or board very quickly gets very board/processor/microcontroller specific. So unless it is for a very popular board (say the Raspberry Pi), the market is probably too fragmented to be worth the effort.
A great starting point is to use one of the many (and inexpensive) evaluation boards:
“In the past I bought the LPCXpresso from Embedded Artists with the LPC1343, and tried to start with this… But as soon as I searched the web about configuration I quickly realized that the STM32 is more popular, so probably I should have bought that!!! 😱 Personally I would be quite happy to use the NXP…”
Here lies another challenge: there are so many of these boards that only a few get widely used and supported. The silicon vendors produce boards for nearly every device they make, but for ‘evaluation’, not for ‘development’. There are too many, and they are too different, so only a few boards reach the critical mass to become community boards. Exceptions are, for example, the Arduino boards, which are mostly compatible, or, the best example for me, the Raspberry Pi boards. The Raspberry Pi Foundation has put a lot of effort into keeping things compatible so that software runs on all their boards; something I don’t see with other boards. Because of this, it is much easier to learn and find things for a Raspberry Pi than, say, for an LPC1343.
Software and Tools
Another aspect of choosing an architecture is the availability and quality of software development tools:
“Another BIG problem that I had was with the “toolchain”.
When I bought the LPCXpresso it was “shipped” with the CodeRed compiler. Honestly, it was a big mess for me to understand that there are 3 main compilers: ARM, IAR and GCC. Now it is a foregone conclusion, but when I started it wasn’t… I didn’t want to be stuck learning one compiler which could have been behind a payment option.
The second problem was finding a good IDE. I wanted to start with the best and most scalable option, but at that time, and probably even now, I’m not really aware of all the pros and cons that one has versus another…”
So the good thing in the ARM world is that there are plenty of choices. The bad thing in the ARM world is that there are plenty of changing choices. Several vendors, including ARM Inc., provide multiple (competing) toolchains and IDEs. And with all the mergers and acquisitions, things are constantly changing too. The one good thing on the IDE side is that many tool vendors have standardized on Eclipse: once I have learned Eclipse, that time is very well invested, as I can reuse my knowledge with the Eclipse-based IDE from another vendor. Until a new ‘standard’ emerges.
Is it more difficult to develop for ARM devices? I think that depends. Personally, I very much like the ARM architecture, but in my opinion it is easier and simpler to develop, say, for Intel. I don’t like the Microsoft+Intel combo, but they seem to do a better job of helping developers with good software and tools. ARM Inc. tries to catch up with mbed/CMSIS and other initiatives, but it is still fragmented and inconsistent. I think the strength of the broad ARM ecosystem is also its biggest weakness: it results in fragmentation, inconsistency and constant change.
On the other hand: because so many vendors use an ARM architecture in their devices, once I have learned the architecture, it is easier to move to the next device. I still have to learn the different peripherals, but at least the core remains (mostly) the same. It is similar with Eclipse as an IDE: most vendors use it, and once I have learned it, that learning time is well invested.
Silicon is getting more and more complex, and with it the software and tools. To me, hardware and silicon are important, but the software and tools are even more so.
What counts in the end is how fast I can finish my project/product with its unique functionality, and the more I can reduce my learning time, the better. With all the mergers and acquisitions going on, it is hard to say where we will end up in 5-10 years. I feel the success of an architecture depends on how easy it is to get new users like Alessandro up to speed.
I know that the topic discussed here is controversial and open to different opinions. But I would like to thank Alessandro for bringing up his thoughts, and for being willing to go on that learning journey. He also brought up one of my favorite quotes:
“I hear and I forget. I see and I remember. I do and I understand.” – Confucius
Happy Learning 🙂
PS: I guess I should start writing a series of ‘basic’ tutorials again, as I did 4 years ago? I wish I could find the time.
PS2: And I would like to thank you all for your comments; I have learned a lot that way. So keep the comments coming 🙂