I believe in ‘life-long learning’: I continue to learn and discover new things every day. I write tutorials to give something back to the community from which I have learned so much.
On top of this, I receive emails on a nearly daily basis asking for help. Many articles have their origin in such requests or questions. I prefer questions or comments in a public forum, because that way everyone can benefit from them. Last week Alessandro contacted me with this:
“I hope this finds you well! I’m starting to use ARM processors, but I find them quite complicated on the configuration side. I started in the past with PIC micros (PIC16) in assembly, and I found them quite straightforward to configure (clock, I/O, peripherals, …). Then I moved to the C language, and to the PIC18, without any big issues.
Now I would really like to join the ARM community; I see that these processors are what I’ve always been looking for, in energy, computing power, peripherals, and FINALLY in IDEs (editor, toolchain and utilities)… AMAZING!!!”
The topic is how to start learning to develop for ARM. Alessandro agreed to make this public, so I thought this might be a good topic for an article.
Long time ago…
I started assembly programming on a MOS Technology 6510 and a Motorola 68000, then wrote my first C compiler for the Hitachi H8/500 and the Motorola HC11, and many other architectures followed. Like Alessandro, I have been pulled into the amazing and powerful world of more advanced processors as they came out over time, and into the fascinating world of IDEs like Eclipse. I have had the pleasure to work with many microcontrollers and processors, but in retrospect these early devices are my favorites: they were rather easy to configure and use compared to the modern ARM Cortex microcontrollers. And that is Alessandro’s point too:
“Here is my point: an ARM Cortex-M3/M4 microcontroller has a datasheet that is roughly 1000 pages long.
I tried to understand the ‘configuration phase’ from this and from the example code ‘shipped’ with the board, but unfortunately there is often a lack of information, so I can’t understand exactly why the manufacturer wrote that line of code instead of another…
Many things are taken for granted… Nobody taught me that ;(“
So it is not only me thinking that there is a problem. Maybe all of us who grew up with the technology take things for granted. As the ‘older’ ones know a lot of the legacy, it is easy to forget that not everything is obvious. It is easy to know ‘too much’ and to forget how much experience and knowledge it takes to understand and use these modern architectures. As for myself, I have realized that my early tutorials were much more basic than the most recent ones. Am I getting too advanced?
As for the ‘1000 pages long data sheet’: the ARM Cortex-M documentation alone is several thousand pages. ARM microcontroller vendors like ST, TI, NXP, etc. integrate the ARM IP (core) and add their own special peripherals, so why should they explain how the ARM IP works? Go look at the ARM web site! But not everyone already knows how the NVIC (see my article series) works, and it is essential for using the device. It makes sense that this information is not copied over and over, but for someone starting with the device it is not obvious and makes for a difficult start. For example, the NXP K64 Sub-Family Reference Manual has 1789 pages and does *not* include anything about the ARM Cortex itself: it is only about the peripherals (UART, SPI, etc.), and all the other important things are behind a ‘yellow’ box with some references to the NVIC, etc.:
Definitely not an easy starting point for a newcomer.
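Those ‘yellow box’ references point at ARM-owned hardware: the NVIC is part of the Cortex-M core itself, so the vendor manual assumes you will read ARM’s documentation (or Joseph Yiu’s book, or the CMSIS-Core headers). The essence is not that complicated, though. Below is a hedged, host-runnable sketch of what CMSIS-Core’s NVIC_EnableIRQ() boils down to; on real silicon the ISER registers sit at 0xE000E100, and here a plain array stands in purely for illustration:

```c
#include <stdint.h>

/* Mock of the NVIC Interrupt Set-Enable Registers (ISER). On a real
 * Cortex-M these are memory-mapped at 0xE000E100 and are 'write-1-to-set':
 * writing a 0 bit has no effect. The OR below emulates that behavior. */
static volatile uint32_t ISER[8];

/* Enable interrupt number 'irq': set bit (irq % 32) in ISER[irq / 32].
 * This is essentially what CMSIS-Core's NVIC_EnableIRQ() does. */
void nvic_enable_irq(uint32_t irq) {
    ISER[irq >> 5] |= (1u << (irq & 0x1Fu));
}
```

The IRQ numbers themselves come from the vendor device header, which is exactly where the vendor manual and the ARM documentation meet.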
Of course the number of pages does not say anything about the quality of the documentation. There are interesting comments on that subject in a Hackaday article (http://hackaday.com/2016/10/27/qualcomm-buys-nxp-in-largest-ever-semiconductor-deal/). One of the comments references the M68HC11 reference manual (http://www.nxp.com/assets/documents/data/en/reference-manuals/M68HC11RM.pdf): it has about 650 very well written pages and includes everything I need to know about the device. Plus there was this small programming reference guide with package information, instruction encodings and all the peripheral settings (ADC, SCI, ports, etc.):
In contrast to this, there is the ARM ‘bible’ written by Joseph Yiu: more than 800 (excellent!) pages, but only about the core itself:
Should I learn Cortex?
Yes, the Cortex-M (ARMv7) has a different level of complexity than the HC11, and an ARM Cortex-A would be even more complex! So maybe the ARM Cortex-M is still the right starting point for learning microcontrollers?
We are having a similar debate at the university: the entry-level microcontroller course uses the Motorola/Freescale-now-NXP-soon-maybe-Qualcomm HCS08 core. It is a rather simple 8/16-bit microcontroller, easy to understand, with no pipelines/caches and straightforward peripherals: a perfect device for learning about microcontrollers. But while the S08 is produced in high volume, it is a proprietary core, and more a thing of the past. Most vendors have moved on to producing ARM-based devices, and very likely the students will end up using an ARM in their workplace or in their projects. So using the HCS08 makes a lot of sense from a learning and didactic point of view, but not from a marketplace perspective.
I want to understand it
Alessandro makes more good points:
“In particular, I would be really REALLY glad to understand how to configure (using just code, not graphical tools -> I want to understand it!):
* The microcontroller clock for the CPU, peripherals and other uses… starting from the datasheet pages down to the code.
* The I/O for general purpose and for peripherals… starting from the datasheet pages down to the code.
As soon as I understand it, I will be happy to switch to the plethora of graphical tools available to do that job!
I know that you are very very busy, but I assume that out there are many people like me who still have the same problem switching to the ARM microcontroller platform, and we all would be very Very VERY VERY GRATEFUL if you could write a great (as always) article on this topic.”
I absolutely agree: I have to learn and understand the basics before going to the high-level tools or something like a graphical pin configuration tool:
This is not only true for graphical configuration tools (which are a great help!), it is the same for using things like a compiler: without an understanding of how a microcontroller works and how it uses assembly instructions to perform a calculation, using a high-level language compiler is like ‘flying in the dark’. Clearly, that does not mean we have to go back and do everything in assembly language, but a good developer should have a good idea of what gets executed as a result of his or her source code being compiled and run on the target.
The same applies to all the graphical configuration tools like Processor Expert, STM32Cube or all the pin configuration and muxing tools: I have to know how things are supposed to work behind the scenes to make good design decisions. The tools are there to help me, so that I don’t have to memorize thousands of pages of data sheets and reference manuals. It needs both: the ‘helper tools’ and a solid reference manual.
However, how the clocking or peripherals like UART, USB, SPI or I2C work is very vendor specific. There are common concepts like clock gating, prescalers, buffering, interrupts, latching, latency and configuration registers, but the implementation details are vendor and even device specific. That means, in the extreme case, these things have to be re-learned for every new device. On the plus side: what I have learned about the ARM Cortex architecture can be re-used.
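To make ‘prescalers’ a bit more concrete: a typical UART derives its baud rate from a module clock through an integer divider, and while the register names differ between vendors, the arithmetic is the same idea everywhere. A small host-runnable sketch with illustrative numbers (48 MHz module clock and the divide-by-16 oversampling used by many Kinetis UARTs; check your device’s reference manual for the real formula):

```c
#include <stdint.h>

/* baud = module_clock / (16 * SBR)  =>  SBR = module_clock / (16 * baud).
 * Adding half the divisor before dividing rounds to the nearest integer,
 * which keeps the baud-rate error small. */
uint32_t uart_sbr(uint32_t module_clock_hz, uint32_t baud) {
    return (module_clock_hz + 8u * baud) / (16u * baud);
}

/* Resulting baud-rate error in parts per thousand: a quick sanity check
 * that the divider keeps us within the usual few-percent UART tolerance. */
int32_t baud_error_permille(uint32_t module_clock_hz, uint32_t baud) {
    int64_t actual = module_clock_hz / (16u * uart_sbr(module_clock_hz, baud));
    return (int32_t)((actual - (int64_t)baud) * 1000 / (int64_t)baud);
}
```

With a 48 MHz clock and 115200 baud this yields SBR = 26, i.e. an actual rate of about 115384 baud (roughly 0.16% fast), well within tolerance.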
What about all the examples from the microcontroller vendors? They can be a good starting point. But I find many examples are not well documented, and if there are comments in the sources, they tell me the ‘what’ and not the ‘why’. Or the examples do not do what I believe they should: I mean, how much sense does a ‘blinky’ example make if it does not use the LED on the board?
In the past I have seen good application notes, but then it is challenging to find the source code for them. Or the sources/projects are outdated because the tools used do not exist any more or have been replaced by other, incompatible tools.
I wish there were more books (see the Books section). I like to have a paper version of good information: I like to make notes on the pages or add sticky notes. But maybe I’m alone: I see only a few students printing the lab instructions or the lecture notes (why?). Publishing books is risky and expensive, although there are new ‘on demand’ models which sound interesting. Still, a fundamental problem remains: a good book about developing for a processor or board very quickly gets very board/processor/microcontroller specific. So unless it is for a very popular board (say the Raspberry Pi), the market is probably too fragmented and not worth the effort.
A great starting point is to use one of the many (and inexpensive) evaluation boards:
“In the past I bought the LPCXpresso board from Embedded Artists with the LPC1343, and tried to start with this… But as soon as I searched the web for something about configuration, I quickly realized that STM32 is more popular, so probably I should have bought that!!! 😱. Personally I would be quite happy to use the NXP…”
Here there is another challenge: there are so many of these boards that only a few get widely used and supported. The silicon vendors produce boards for nearly every device they make, for ‘evaluation’ but not for ‘development’. Too many and too different, so only a few boards reach the critical mass to become community boards. The exceptions are, for example, the Arduino boards, which are mostly compatible, or, the best example for me, the Raspberry Pi boards. The Raspberry Pi Foundation has put a lot of effort into keeping things compatible so that software runs on all boards, something I don’t see elsewhere. And with this, it is much easier to learn and find things for a Raspberry Pi than, say, for an LPC1343.
Software and Tools
Another aspect for using an architecture is the availability and quality of software development tools:
“Another BIG problem that I had was with the ‘toolchain’.
When I bought the LPCXpresso it ‘shipped’ with the Code Red compiler. Honestly, it was a big mess for me to understand that there are 3 main compilers: ARM, IAR and GCC. Now it is a foregone conclusion, but when I started it wasn’t… I didn’t want to be stuck learning one compiler which might have required payment.
The second problem was finding a good IDE. I wanted to start with the best and most scalable option, but at that time, and probably even now, I’m not really aware of all the pros and cons one has vs. another…”
So the good thing in the ARM world is that there are plenty of choices. The bad thing in the ARM world is that there are plenty of changing choices. Several vendors, including ARM Inc., provide multiple (competing) toolchains and IDEs, and with all the mergers and acquisitions things are constantly changing too. The one good thing on the IDE side is that many tool vendors have standardized on Eclipse: once I have learned Eclipse, that time is very well invested, as I can reuse my knowledge of the IDE with other vendors. Until a new ‘standard’ emerges.
Is it more difficult to develop for ARM devices? I think that depends. Personally, I very much like the ARM architecture. But in my opinion it is easier and simpler to develop, say, for Intel. I don’t like the Microsoft+Intel combo, but they seem to do a better job of helping developers with good software and tools. ARM Inc. tries to catch up with mbed/CMSIS and other initiatives, but it is still fragmented and inconsistent. I think the strength of the broad ARM ecosystem is also its biggest weakness: it results in fragmentation, inconsistency and constant change.
On the other side: because so many vendors use an ARM architecture in their devices, once I have learned the architecture, the next device is easier. I still have to learn the different peripherals, but at least the core remains (mostly) the same. It is similar with Eclipse as an IDE: most vendors use it, and once I have learned it, that learning time is well invested.
Silicon is getting more and more complex, and with it the software and tools. To me, hardware and silicon are important, but the software and tools are even more important.
What counts in the end is how fast I can finish my project/product with unique functionality, and the more I can reduce my learning time, the better. With all the mergers and acquisitions going on, it is hard to say where we will end up in 5-10 years. I feel the success of an architecture depends on how easy it is to get new users like Alessandro up to speed.
I know that the topic discussed here is controversial and open to different opinions. But I would like to thank Alessandro for sharing his thoughts and for wanting to be on that learning journey. He also brought up one of my favorite quotes:
“I hear and I forget. I see and I remember. I do and I understand.” – Confucius
Happy Learning 🙂
PS: I guess I should start writing a series of ‘basic’ tutorials like the ones I wrote 4 years ago? I wish I could find some time.
PS2: And I would like to thank you all for your comments; I have learned a lot that way. So keep the comments coming 🙂
My first development with the KE06Z was the most challenging project start in 25+ years: different and long documents for peripherals, core and tools; working out bit manipulation/read-modify-write; realizing I had to turn on the clock before using a peripheral or else the processor resets…
But before that, the PIC24 was a challenge too, for example because of all the I/O port mapping: useful, but it took longer to learn before getting any good results.
I guess the first ARM project is harder, but the next after that should be easier?
Maybe you had some bad luck. To me, the KE series is ‘the ugly child’ in the NXP Kinetis family. You will find less support for it, and I think this is because its peripherals are very different from the rest of the Kinetis devices. I was really interested in the 5V capabilities, but these kinds of things have turned me away from using the KE family. And yes, I think after this first ARM project at least some of what you have learned (mostly about the core itself) will be applicable to other microcontrollers, so it should be easier.
That’s a really great blog post! I’ve been hopping on/off embedded platforms for a while, but most of the time I gave up, pretty frustrated.
I think there are a few problems why development on the ARM Cortex-M chips is so difficult:
1. The absence of libraries which provide the right level of abstraction. Arduino is a great example of how easy uC development can be: it comes with good default settings and an amazing set of libraries which “just work”.
2. Documentation & Tutorials. Apart from your website it’s really difficult to find good examples and tutorials for Cortex-M silicon.
3. Community. I haven’t found an Arduino-like community for ARM based boards. Every vendor has its forums, but I think a community is much more than that.
I think that https://mbed.com is a nice try, but my impression is that it’s just driven by a few individuals (although they might be paid by some of the silicon vendors).
My tools of choice are currently http://platformio.org + Atom/Visual Studio Code. Despite having a few Cortex-M dev boards lying around, I have currently accepted my fate that I either use an Arduino or a Raspberry/Orange Pi for my twittering garden sprinkler side projects. On both, I’m productive within a few minutes.
Granted – I’m talking purely from a hobbyist perspective.
Keep up the good work Erich – maybe I will give ARM Cortex-M in the future another try 🙂
1) You give a good example with the Arduino libraries. They are so successful because they are based on the ‘Wiring’ foundation, which makes the drivers common, uniform and easily understandable. In contrast, the silicon vendor libraries somehow seem to make things different on purpose, making it really hard to move from one microcontroller family to another. On the other hand, the Arduino libraries can be hard to use outside the Arduino development tools; what I ended up doing is porting libraries to native tools so I can debug them. And there are the CMSIS-Driver and CMSIS-Pack initiatives by ARM, but I have not seen them really take off (yet?).
2) The fact that I found tons of tutorials for the Raspy and Arduino but nearly nothing for the Cortex-M is the biggest reason why I started this blog. Maybe the hobby developer is much more open to sharing his thoughts and findings than the ones using Cortex-M?
3) Agreed, there are the vendor communities, and really nothing scalable outside of them. And for the boards, I think the problem is the silicon vendors themselves: they produce too many boards, spreading the peanut butter too thin, so no community can grow.
And I share the same feeling about mbed. Only a few boards are supported, and most of them seem to be the result of vendor sponsoring rather than a large user base.
What could be the ‘next big thing’ are initiatives like Zephyr (https://mcuoneclipse.com/2017/01/22/zephyr-thoughts-and-first-steps-on-the-arm-cortex-m4f-with-gcc-gdb-and-eclipse/) or, maybe even better, Apache Mynewt (https://mynewt.apache.org/).
And you are right about using the Raspy/etc: lots of processing power, vibrant community, inexpensive board and I can get things up and running in a very short time. That’s how things should be.
Yes, I agree ARM is more difficult to understand ~ and thank you, Erich, for sharing so much about it. I started with a 6800 development suitcase with a teletype interface, typing the assembly commands in – and then couldn’t figure out how to start it for some time – no tutorials. So ARM is more complex, but with the internet the tutorials and tools are much more widespread. Having the ARM Cortex-M across a number of manufacturers is excellent, as I don’t have to relearn specific architectures all the time.
I continue to wonder how the oldies among us were able to get all the needed information without the internet? Maybe that was the time when good (printed!) reference manuals and books were worth the money.
I started with an F38E70 and yes, manuals, books and vendor support were enough back then to fully understand and develop for simpler devices. But imho we cannot approach a modern processor like ARM in the same way. New devices have grown so complex that it is no longer possible to master them completely while still delivering a project on time. An application programmer needs underlying driver layers that do the low-level job for him and free him from studying thousands of pages. PE has been the best solution ever.
Discovering Kinetis Design Studio, Processor Expert and the SDK, along with this blog, has been the catalyst for my selection of Kinetis devices for around 5 commercial projects in the last 2 years. Prior to that I had looked at the LPCware libraries and installed the Eclipse ARM IDEs from other vendors, but the mountain to climb became too big, so I continued with AVR & MSP430. I could develop end applications for these devices without a vendor-supplied SDK; doing the same with Cortex-M4 devices would be significantly more challenging. Conversely, I could ask the question: would developing tomorrow’s devices and applications using older 8/16-bit architectures be more difficult than with ARM?
Using older 8/16-bit architectures would be simpler, in my experience. The challenge will be to still get all the tools in place, say for Windows 10, as vendors seem not to invest much engineering into moving these tools into the next decade.
The other (maybe more challenging) part is the following: I have seen several cases/projects in which an ARM was proposed as the solution. That was accepted by management with no questions asked. But whenever a non-ARM architecture was proposed for a project, management asked a lot of questions and wanted justifications for why it had to be a non-ARM core. I thought that was interesting.
When kicking the tires on my K64-based boards, I had the same problem with the examples provided with KDS. I just wanted the most basic hello-world blinky-type project, but there was none. I wanted to create a pure program foundation by building one up myself. The problem I faced in getting my own program working was the initialization of the output pins beyond just configuring them. I finally worked it out by looking at a few examples, but the closest one, a blinky plus accelerometer, was way more complicated than a very basic example needed to be.
Later on in that same project, I started looking into more efficient ways to write the state of multiple pins and went down a rabbit hole about bit banding, which is documented, but much of it is written at a high level that assumes a lot of prior knowledge. It really needed a tutorial like the ones found here to restate it with more context or practical examples. That is one of those features that the manufacturer doesn’t document other than to tell you it is available and what the address range is. Then you wind up digging through the ARM documentation to learn the vocabulary to finally search and find a tutorial.
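For anyone hitting the same wall: the bit-band mapping is pure address arithmetic, defined by ARM (not the vendor) for the first 1 MB of the SRAM (0x20000000) and peripheral (0x40000000) regions on Cortex-M3/M4. Each bit gets its own word-sized alias, so a single store replaces a read-modify-write sequence. A host-runnable sketch of the formula from the ARM documentation:

```c
#include <stdint.h>

/* alias = alias_base + (byte_offset_in_region * 32) + (bit_number * 4)
 * SRAM region 0x20000000 aliases at 0x22000000,
 * peripheral region 0x40000000 aliases at 0x42000000.
 * Only the first 1 MB of each region is bit-band capable. */
uint32_t bitband_alias(uint32_t byte_addr, uint32_t bit) {
    uint32_t region_base = byte_addr & 0xF0000000u;   /* 0x20000000 or 0x40000000 */
    uint32_t alias_base  = region_base + 0x02000000u; /* 0x22000000 or 0x42000000 */
    return alias_base + (byte_addr - region_base) * 32u + bit * 4u;
}
```

On the target, `*(volatile uint32_t *)bitband_alias(addr, bit) = 1;` then sets exactly that one bit in a single write.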
This also gets into another area of discussion about where the tools are heading in the longer term: whether we will see more usage of micro-OSes to handle the basic setup (relying on there being enough memory to do so), or whether things turn more abstract, with more visual programming like what is popular in the Arduino space. Also, is there another generation of MCUs that is going to make a big shift in the same way the change from 8-bit to 16 and 32 did over the past couple of decades? It feels like we have hit a similar spot to computer processors, where there isn’t a big pain point in features or cost, or a looming big technology bump.
I started writing code in assembler for the 6800 and 6502. Having 4K, then 48K of RAM, I developed a writing style almost devoid of comments, as that way the slow floppy disk accesses were minimized. Now, with almost limitless memory, I have disciplined myself to add comments that help me understand what’s going on. Having been forced to use code and APIs supplied by manufacturers of specialized parts, such as DVB demodulators and tuners, I am always astounded at the almost total lack of comments. The same is true of a lot of the frameworks that the likes of Kinetis etc. supply.
Yah, comments are a pet peeve of mine these days. There are a lot of people who do not comment their code at all. Perhaps they figure that they will never come back to it again, or that no one else will ever have to look at it, I don’t know.
Personally, I can’t code without comments anymore as it’s part of my stream of consciousness coding.
Good engineering practice requires good comments, file organization and naming. That’s why I loved the uCOS operating system: it was easy to understand that way.
I’ve found that the Teensyduino environment is a good starting point for programming the Teensy boards (which have some of the more popular NXP ARM processors). I quickly changed to using direct register access for the time-critical stuff, but did not have to worry about things like getting the clocks started correctly.
My son, who used a KL02 in his first commercial product, preferred working with the gcc toolchain and doing everything himself—but he was trying to minimize flash and RAM to save money on the processor. The overhead of almost any of the available run-time systems was too large. (On his second commercial project, he went for much cheaper non-ARM chips and did everything in assembly language, because there wasn’t a decent free C compiler—no working code generator for gcc even—I wouldn’t have had the patience.)
I think it is all about what a given project needs, and clearly one approach will not please everyone. This ranges from using interpreted scripting languages down to pure assembly programming.
It depends on the level of control needed and how hard or big the problem to be solved is. To me, the important point is to have a choice and to be able to use what fits best. What I don’t like is when vendors build ‘walled gardens’ with no easy way out (or in). I love the Teensy boards, but they are one of these ‘walled gardens’ or ‘golden cages’ too, with that proprietary bootloader and the Teensyduino environment.
This is a great blog post. Most people lack a path to follow, and in messing around, they forge their own paths. I started off with Zilog 8-bit controllers. Like PICs, they are easy and straightforward to configure compared to ARM. So what I did, and what I am still doing, is this: although I love using the Eclipse IDE and Processor Expert to generate code, I read the datasheet (only a single chapter for a peripheral, plus the SIM module chapter for integration) and try to configure the peripheral in question myself. I began with the GPIO. I was confused about the difference between the GPIO and PORT registers: I expected them to be one and the same; it turns out you have to configure both to achieve what you want. I had also started with bit-fields using structs on the Zilog (https://github.com/Muriukidavid/z8_64k/blob/master/ez8_64k.h); when I came to ARM, I learned that those are not portable and should be avoided completely. I ended up working on the TWR-K60N512, and so far I have most of the peripherals working, including GPIOs, UARTs, infra-red, ADC, TSI and timers, and I still have USB, Ethernet and CAN to go. It has been a great experience.
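The GPIO-vs-PORT split described above fits in a few lines: on Kinetis the PORT module owns the pin multiplexing, the GPIO module owns direction and data, and the SIM clock gate has to be opened before touching either. To keep this sketch self-contained and host-runnable, minimal mock structs stand in for the vendor device header; the register names follow the Kinetis layout, but the pin choice (PTA11) and mask values are illustrative, so check them against your board and reference manual:

```c
#include <stdint.h>

/* Minimal stand-ins for the vendor device header (e.g. MK60D10.h). */
typedef struct { volatile uint32_t SCGC5; } SIM_Type;                               /* clock gating */
typedef struct { volatile uint32_t PCR[32]; } PORT_Type;                            /* pin control (mux) */
typedef struct { volatile uint32_t PDOR, PSOR, PCOR, PTOR, PDIR, PDDR; } GPIO_Type; /* data/direction */

#define SIM_SCGC5_PORTA_MASK (1u << 9)          /* clock gate bit for PORTA */
#define PORT_PCR_MUX(x)      (((x) & 7u) << 8)  /* pin mux field, 1 = GPIO */

static SIM_Type sim; static PORT_Type porta; static GPIO_Type pta;
SIM_Type *SIM = &sim; PORT_Type *PORTA = &porta; GPIO_Type *PTA = &pta;

/* Order matters: without step 1, the PORTA access faults on real silicon. */
void led_init(void) {
    SIM->SCGC5    |= SIM_SCGC5_PORTA_MASK; /* 1) SIM: ungate the PORTA clock */
    PORTA->PCR[11] = PORT_PCR_MUX(1);      /* 2) PORT module: mux pin PTA11 as GPIO */
    PTA->PDDR     |= (1u << 11);           /* 3) GPIO module: make the pin an output */
    PTA->PSOR      = (1u << 11);           /* 4) GPIO module: drive it high */
}
```

Steps 2 and 3/4 touch two different peripherals for the same physical pin, which is exactly the confusion described above.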
I teach my students to learn the bare-metal way before jumping to IDEs, hence twrk60n512-blink-blink-blink-already by Lexy Andati. I know the need to actually understand what’s happening in detail, instead of getting code you can’t understand generated by some tool or taken from application notes. If you later use code generators, you can at least understand that code. It is not easy, but it is very rewarding to learn the basics.
After understanding the bare-metal way, I think that the choice of method and tools (how to approach each project/task) should be made depending on the amount of time available and how quickly the results are required. I use both quite comfortably now.
In my courses, I start with bare metal too to teach the basic concepts (interrupts, events, reentrancy) and only then go up to the higher level with RTOS, tasks, etc.
I think the same applies to microcontrollers: it is good to start with something basic and simple, and only then go to the more complex thing. Naturally, the older ones among us started with the simple things, while new developers might jump directly onto the complex ones. On the other hand: even the most complex things can be approached from the basics; it is all about how to divide the complex problem into smaller pieces which can be handled more easily.
Hi Erich, hi all.
I read the article and of course all the comments.
I am starting to feel a little bit better knowing I am not the only one who had problems starting with ARM.
In the end I have to be honest with myself: every time I learned something (especially at the university), it was when I fleshed out the topic. When I tried to start with ARM (years ago) I thought that it would be as easy as with the PIC micros, so instead of analyzing the code with a fine-toothed comb, I quickly searched for “answers” on the web.
As a result I wasted my time, and I’m still here : (
So I think that this is the right one : )
Tonight I downloaded the latest LPCXpresso, and the examples.
I’m running the code step by step, and writing down all the function calls, registers and values about the oscillator configuration with the help of datasheet.
I hope to understand it quickly and well.
I can confirm that you are definitely not alone with this experience. The good thing is that these kinds of things are a good learning experience, so just keep it up!
Is there an alternative to the Kinetis KE family in the NXP portfolio?
I am looking for something better supported. It has to be a modern one (not an HC16) at a comparable price.
The KE1x has SDK support, but I have not yet tested whether this makes development easier.
I have also looked into the EA family, but did not find SDK support there.
Is there any alternative to the Kinetis KE family? Anyone know?
I didn’t find an alternative in the NXP families.
It looks like Atmel and Cypress (http://www.cypress.com/products/fm0-s6e1a-series-5v-robust-arm-cortex-m0-microcontroller-mcu-family) both have something similar, but I personally don’t like using Atmel for production designs because their lead times can be variable and long, and I don’t have a good feel for Cypress part availability.
What about the drivers (like PE/SDK) for Cypress, do you find them useful?
Hi Erich, great article. I am curious: if we have CMSIS-compatible BSPs distributed by the silicon vendors, can we use them interchangeably, as the HAL would take care of the differences? I have a study project in mind, which is to evaluate 6 RTOSes on the KL25Z to compare their performance. If this is CMSIS compatible, I can make it work on some other boards as well by just adding support for a CMSIS BSP for that board.
Thanks! 🙂 It depends on what you mean by ‘CMSIS compatible BSP’. To my knowledge all vendors have some level of CMSIS support (at least CMSIS-Core), but not all provide the full CMSIS (as far as I can tell, none provides full CMSIS support, maybe with the exception of ARM itself).
Hey, you guessed it right, I meant full CMSIS support. To me, if I am able to make it CMSIS-Core compatible, then when changing the MCU we can just add/change a few things and get going. I agree this is a very idealistic expectation, but I will still give it a try. Thanks for the input. Can you please explain what you mean by full CMSIS vs. CMSIS-Core? From my understanding, CMSIS-Core will have most peripheral registers, but not high-end peripherals like USB, Ethernet etc.
See the image on https://www.arm.com/products/processors/cortex-m/cortex-microcontroller-software-interface-standard.php with all the different parts. CMSIS-Core is only about the ARM core registers and functions, e.g. to set an interrupt level, access the SysTick, etc. For the peripherals/drivers you would need CMSIS-Driver, and CMSIS-Pack as the delivery mechanism. See https://mcuoneclipse.com/2016/02/14/are-arm-cmsis-pack-the-future-of-software-components/
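To make the CMSIS-Core scope concrete: SysTick, like the NVIC, lives in the ARM core and is identical across vendors, which is exactly why CMSIS-Core can cover it. Below is a hedged, host-runnable sketch of what the real SysTick_Config() in core_cm4.h essentially does (the real registers sit at 0xE000E010 and are mocked here; the real function also sets the SysTick interrupt priority, which is omitted for brevity):

```c
#include <stdint.h>

/* Mock of the SysTick register block (CTRL, LOAD, VAL, CALIB). */
typedef struct { volatile uint32_t CTRL, LOAD, VAL, CALIB; } SysTick_Type;
static SysTick_Type systick_regs;
SysTick_Type *SysTick = &systick_regs;

/* Program SysTick to fire every 'ticks' core clocks, as SysTick_Config()
 * does: load the 24-bit reload value, clear the counter, then enable the
 * timer with its interrupt, clocked from the core clock. */
uint32_t systick_config(uint32_t ticks) {
    if ((ticks - 1u) > 0x00FFFFFFu) {
        return 1u;                /* reload value is only 24 bits wide */
    }
    SysTick->LOAD = ticks - 1u;   /* counts down from LOAD to 0 */
    SysTick->VAL  = 0u;           /* reset the current counter value */
    SysTick->CTRL = 0x7u;         /* CLKSOURCE | TICKINT | ENABLE */
    return 0u;                    /* 0 = success, as in CMSIS-Core */
}
```

For a 1 ms tick with a 48 MHz core clock one would call `systick_config(48000000u / 1000u)`. Everything beyond such core features (UART, USB, Ethernet) is vendor territory, which is where CMSIS-Driver would have to step in.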
Great article, Erich. As someone who has progressed through programming the Z80, 6502, PIC, dsPIC, SHARC, etc. professionally over the last 30 years, I have found the transition to programming ARM a big challenge. Everything seems much more complicated, for what I perceive to be relatively little real gain, at least for the low-end M0 and M0+ processor types. I’m getting there now, but it never ceases to amaze me how complicated otherwise simple tasks are becoming with modern micro dev tools!!! I want to spend more time worrying about how to write efficient code for my micro than chasing down some obscure error that my debug tool keeps throwing, or trying to work out why Windows 10 bricked my dev board before I even got a chance to fire up the IDE (>.<)
Thanks! So yes, it has been a complicated journey for myself too. And looking at the more modern M23 and M33, things have become even more complex by now. The good thing is that at least knowledge of the architecture (leaving all the vendor specifics out) can be applied to other devices within the same (or a close) family of devices. But I think ARM did it wrong by adding more and more complexity, and in a connected world this means more vulnerabilities and issues too.
With this in mind, I’m wondering where the journey will go with RISC-V: to me, this architecture will probably replace ARM over the long term, and we will have to see what that means for complexity. The issue with RISC-V is that it comes with a lot of diversity, for good and for bad.
Have you had any exposure to RISC-V already?
Hi Erich, thanks for the reply. I haven’t encountered RISC-V yet, but it looks like a recipe for an even more fragmented new “standard” to emerge! I can imagine the Asian chip manufacturers running with it, though, to avoid restrictions and royalties to the west. I just hope that ARM stays around for a while after I have invested the time to get up to speed with the platform. That is, when the chips actually become available again! (BTW, thanks for your excellent articles on unbricking the FRDM-KL26Z boards, without which I would have surely bricked a couple more boards before figuring out what was happening! I’m surprised NXP didn’t deal with this better. A warning sticker on the box is all it would have taken!) Now to start playing with my new FRDM-K22F Cortex-M4 board. I must be a sucker for punishment! -Richie
Yes, imho the added diversity of the RISC-V variants will be a big challenge as soon as it requires toolchain (compiler, linker, debugger) changes: very few are able to make the needed changes. And yes, I think as well that some countries will use it to be independent from ARM (or Intel, …). At the same time I see mid-size and larger companies investing in RISC-V because they have been deeply disappointed by the traditional ARM MCU providers, who were not able to deliver parts during this continued chip shortage: having the ability to produce your own RISC-V device (see for example Seeed) can be an advantage. You still need a factory, but that is much further down the dependency chain than today.
And thanks for your note about the FRDM boards. I don’t understand why boards are still shipped with that faulty bootloader/firmware on them. The FRDM-K22F is a great one, and you are lucky to have one in your hands: I know that stocks of the FRDM-K22F have been harvested from inventories just to take the K22FN512 chip from them! We are desperately looking for them too for our projects, and we have received quotes of up to $600 for a single MCU! Yikes!