Have a read at http://embedded.fm/blog/2017/8/12/dont-use-arduino-for-professional-work.
I love the Arduino ecosystem, and it is great for getting something up and running quickly for a ‘proof of concept’. But it stops at that point.
I think it should be obvious why Arduino (code and libraries) should not be used for professional work. In particular, the lack of proper debugging support makes it nearly impossible to solve real-world problems. 'printf()'-style debugging is simple, but it is a huge waste of time. I have seen too many student projects fail because of the inability to properly debug the system and solve the problems.
Equally important for professional work is the topic of IP and licensing. Be aware of the licensing terms and conditions, as pointed out in the article above. If not, your product can easily get GPL'd, which might not be what you want.
Happy Arduinoing 🙂
The Arduino community is also an excellent place to find examples of the ‘expert beginner’ syndrome:
https://www.daedtech.com/how-developers-stop-learning-rise-of-the-expert-beginner/
The community is full of helpful people who have gained experience working in the limited Arduino environment but don't know about the larger embedded world beyond it, and are all too eager to lead others into the same dead end.
I always tell people that Arduino is a great place to start, but that if they want to learn embedded systems and not just use it to accomplish some other specific goal, they should plan on it *only* being a place to start and expect to move on when they’ve got the basics.
It’s still a great platform for artists and hobbyists who have a project in mind and don’t want to learn the ins and outs of clock generators and linker configuration files.
It saddens me to see that even so-called "professionals" fail to do a fact check and are willing to step into some pointless rambling such as this one.
Pretty much all the points in the cited article are simply *NOT TRUE*, and your reiteration of certain points doesn't make them any truer. It starts with the misconception of what Arduino actually is: there is no single meaning of Arduino. It's the name of a company, the name of a (standardised) form factor, the name of hardware, the name of an IDE and the name of what I would call an ecosystem.
So what are we actually talking about here? Most points seem to be focused on the IDE, which is bizarre because it's (on purpose, because it's dumbed down to the max) the weakest point of all and not even required to do anything meaningful with the Arduino hardware or the Arduino ecosystem, which, like it or not, blows ANY other MCU ecosystem on the market out of the water. Don't like the IDE? Then use Eclipse, Atmel Studio, your favourite IDE with platformio support or (as the real pros would do) vi; the Arduino IDE is nothing more than an editor, a gcc compiler and a few other tools combined into an (admittedly lousy) IDE, just like any other. Want to use the ecosystem? Then do it. Want to debug? Great, get yourself a debugger, connect it and use your IDE or gdb to start debugging!
And what is this utter nonsense about the GPL? The original article claimed that most of the ecosystem is (L)GPL, which is pretty much the only sane license in the GNU world. All that it says is: you can do whatever you want with the code, but if you make any changes to it which you use in your own code, you have to contribute the changes to the foreign code back to the community (in the form of publishing the changes), which is exactly what the community wants and absolutely harmless to any "professional". Unlike the GPL, the LGPL is not infectious: your code stays your code, and there's no chance in hell, as you claimed, that your product needs to be GPL licensed. As usual (and as every professional will know), if you utilise other work in your product you will need to study the license and make sure you're allowed to use it as intended; if you pull in or link against GPL code (and that can happen to you anywhere, not just with the Arduino ecosystem!) then you simply f*cked up and may not live up to the self-claimed title "professional".
I would rather think the main issue with Arduino is the source code itself. It is not C, it is not C++, but some mix. It doesn't follow standards, so people don't learn either language properly. And since all hardware drivers are hidden away, you won't learn how to write those, which is the very core of embedded systems development. Neither will you learn about best practices when interacting with hardware, nor all the "bits and bytes": how machine code is generated from high-level source, and so on. It isn't *meant* to be in-depth. But professional use does require in-depth knowledge; you can't release a product and then, when there are bugs, blame some Arduino library for them.
Eric
I tend to work on about 3 or 4 professional projects a year that have been (originally) developed with Arduino.
Usually to rescue the project or add parts that are not available in that environment.
Some of the companies who have developed and are trying to market these Arduino based products are big, well known international companies(!)
They use no debuggers, their programmers/developers/engineers know little about embedded development, and they often believe that the processor can only operate together with the on-board JTAG loader chips.
Arduino is essentially a script-kiddy environment at its root, allowing anyone to load a script (maybe after a small code change to give it an extra kick) and see it running on their board with virtually no further knowledge. Many other modern tools try to do the same: allow engineers with less and less knowledge to do a little more. Maybe they serve especially as a marketing tool for semiconductor manufacturers to lock engineers into a product that must be so easy to develop with, and of course completely free of charge, since that is the most important development decision criterion for many organisations (don't worry about putting a consultant on it for a few months, because the tool costs nothing to use). The result is that the engineer is dumbed down even more. They can very quickly get something started, but their overall time to get it finished (and with reasonable quality) can be much longer, if it is achievable at all.
I have been shocked by some such industrial developments (the tools used and the lack of embedded knowledge leading to high costs, poor products and late delivery). I also give support to many companies around the world and am constantly shocked by the fact that many development engineers simply have too little experience and understanding of debugging. This makes them lose huge amounts of time. I seriously question the present trend of all the code generation/configuration tools, especially as used in education: they may allow a quick start, but at some point the developer needs to understand what he or she is actually doing. Based on what I regularly see, this understanding is seriously lacking in young developers today, which is why I believe the education system must be gravely at fault too.
Regards
Mark
Hi Mark,
Thanks for your input! This is exactly what I'm seeing in the industry too. I talked with the product lead of a known brand product about their needs for engineers. The response was shocking, as they feel they do not need any hardware and software engineers "as you don't need to write software, you can use all these mbed and Arduino libraries". I believe this originates from the fact that more and more of these 'decision makers' never did real development, or never had to solve a hard problem or use a debugger, logic analyzer or oscilloscope.
And I agree with you that the education system has a problem too. Again, to me the problem could be that the educators/teachers rarely come out of industry, and they think that printf() is all they need (and then, if they realise it at all, are shocked by the impact on execution time, code size and the amount of RAM needed). I agree that engineers need to learn and understand what they are actually doing. Code generators, libraries and the like only make sense once I have understood what is behind them. But this takes time and effort. That's why I shake my head when I hear about a beginner programming course for young engineers at a university which is marketed as "we build a full IoT application". Surely management heard about 'IoT' and thinks that must be a good thing in any case. I'm worried about the next generation of engineers if this is really the way forward.
Erich
I have no experience in the embedded world, but this really reminds me of the sterile diatribe about low-level programming languages versus high-level programming languages.
It's clear that high-level, very abstract languages allow you to be more lazy and dumb and still achieve your goal the quick-and-dirty way.
Is this a good reason not to consider them “real” and “professional” languages?
There are technical differences between various embedded platforms, just as between various languages. A professional will make informed decisions based on these technical factors, not based on "how easy is it to reach my target while being lazy?" or "how impossible is it to get anything done if I do not memorize every detail of my hardware before even writing a hello world?".
Hi Daniel,
thanks for your thoughts and input on this! I think the point is about 'the right tool for a given problem'. What I observe many times is that if a tool works great for one problem, it gets used for 'all' problems, even if it is not adequate. I think a 'professional' approach is to select the right tool, together with an awareness of what might come. It might be OK to cut a single small tree with a knife, but if you have to cut many trees, and maybe a whole forest, something different is needed.
This applies to programming languages: for a small problem, a small script is probably the right thing. For a very complex and distributed application, you probably need something more powerful. Again, don't just use the tool that happens to be at hand, but something which is able to do the job, long term as well.
Hi Erich, I hope you’re doing fine and as always thank you for sharing your experiments in the blog (the laser cutter has been very interesting).
Just wanted to remind you that this blog is not only useful for professional embedded engineers but also to hobbyists like me.
So I finally found a post that somewhat relates to a question that is nagging me lately.
I’m not a professional engineer at all, but I found my way into this field through Arduino. And so I started playing with it and doing some projects.
Some months ago I decided to create a thermocycler where my Arduino didn’t deliver on a certain specification (ADC resolution). Instead of buying an Arduino Mega, my brother told me that he was working with some NXP hardware, and that NXP gifted them some Freedom boards. He told me to try them out. (He’s an embedded hardware engineer)
Safe to say, it was a complete nightmare at the beginning. Just setting the thing up was horrible (coming from the right-out-of-the-oven-cookie-Arduino-approach). And somehow, digging through the NXP forums, I found your blog.
And your blog saved my life as an introduction to these platforms.
Without it, I wouldn't have learned so many things regarding the memory usage of printf() or debugging. I even learned a lot about how information gets stored in the microprocessor and how it's read.
I also had to immerse myself in the ADC manuals, try to understand the clock system (really difficult), and learn how the buffer works in UART communication.
I even used some of the Processor Expert components you shared with buffer-safe character reading and writing. They were extremely useful.
Anyway, lots of learning.
So to avoid this becoming an essay, my question is this:
Is there a certain point where to stop using Processor Expert in an application?
I have finished my code and the thermocycler works as expected. I have a nice ADC conversion (minimal noise), and I've performed some PCR reactions and it works great. Generally I get a temperature accuracy of 0.5 °C (which is within bounds for PCR reaction standards).
I understand Processor Expert is a crutch (I guess). However, when should I start worrying about execution time, RAM usage, or code size?
I haven’t dealt with any of these issues and they haven’t arisen at all.
Would you recommend optimizing this application as much as possible? Or should I wait to learn how to deal with these problems until another application demands greater execution speed or runs into RAM problems?
If you happen to have any feedback, that would be great.
Thank you Erich,
Leonardo
Hi Leonardo,
thanks for all your thoughts! And congratulations on your way of getting up to speed and learning new things. Indeed, 'Arduino' as a whole brought programming, electrical engineering and computer science aspects to people who otherwise never would have entered that domain of knowledge. Similar to Lego Mindstorms and others motivating young kids not only to consume things, but to be creative and productive by creating something new.
Unfortunately, as in many other cases, without the proper education (either self-taught or in schools or universities) things might not be easy and simple all the way through. A hobby pilot is most likely not doing the same things and not using the same equipment as an airline pilot, and does not have the same education. I'm not saying that the airline pilot flies better, I'm only saying it is different.
To your question: "Is there a certain point where to stop using Processor Expert in an application?"
This all depends on your application and where you want to go: many users of Processor Expert use it to generate code which works, and copy/take it from there. Others use it as an additional reference to the manuals and data sheets. But it does not stop there: you can use it up to the full level of application development. There is however one thing with all these tools: you have to have at least a basic understanding of what is going on underneath. Using the ADC or I2C component without understanding the ADC or how I2C works will set you up for failure. So to me, you can use it up to the level at which you understand the underlying hardware. Maybe not every single hidden bit, but the general concepts and how things work in general. The same goes for execution time, RAM or FLASH size: you need to have a pretty good understanding of how the programming language (C, C++, ASM) works and what kind of code it generates.
If, as in your case, you don't have a problem, then you should not worry about it upfront. You will get problems sooner or later :-).
As for execution speed or RAM usage: this is a very general problem and question. In general, my answer would be that you trade more FLASH and/or RAM for faster execution speed, and vice versa.
I hope this helps,
Erich
Thank you for your feedback, Erich! I will definitely need to learn much more about the underlying hardware and software. Perhaps slowly removing the training wheels will make me learn out of pure need, ha.
Somehow WordPress never told me you had already written an answer. I just noticed by coming directly to this thread. No idea why. Just a comment in case this happens to someone else.
Erich,
Totally agree. Just a question: would you make the same statement in the case of the Raspberry Pi?
Sometimes I think there is a bit of misunderstanding with the term 'professional' being used. To me, an embedded HW/SW professional is someone who earns his living by working as just that, an embedded HW/SW professional. I understand that some people may feel offended if they or their work are not considered professional, but keep calm: someone can be an excellent embedded C/C++ developer who masters most debugging techniques... and work in banking!
regards,
gaston
Hi Gaston,
For me, the term 'professional' is the opposite of 'hobby'. Professional means you earn money with it or make your living from it. Professional means delivering goods and services, and having a legal and contractual relationship. This includes warranty and all the other legal obligations. To some level it also means to me that someone has had an education or a degree. As with everything else, there is no black and white, but lots of gray as well.
So for the example you gave of the banker, I would think that this is more of a hobby for him. And to me, doing something as a hobby should not be considered a bad thing. For example, I'm doing this blog as a hobby too.
You gave the example of the Raspberry Pi: would I use this in a 'professional' project? It will depend (availability, stability, temperature range, long-term supply and maintainability). Most likely I would consider an 'industrial' Linux module instead. If my business were selling 100 home automation systems a year to the hobby market, and I didn't care what hardware I would need for next year, then I probably would go with a Raspberry Pi too. The same goes for all other 'evaluation' boards like the Arduino boards, STM Nucleo or NXP Tower/FRDM boards: I would use them for a proof-of-concept phase, and most likely not consider them for a 'professional' product unless the scope and duration are very limited.
I hope that makes sense,
Erich
I had a similar conversation a month ago. A real difference between hobby/professional is the ability to scale to market without going bankrupt.
That means all the bugs and corner cases need to be worked out. The design has to be able to be manufactured and tested cheaply at scale. And it needs to be reliable enough in the field that the number of failures due to your product is lower than the number of failures caused by abuse or confusion on the part of your customers. Note that you need to build products that are robust and generally just work in the hands of the customer.
All that stuff is often the really hard part.
On the other hand, I totally feel for people getting started, as the traditional path has a really steep learning curve. I'm not a fan of pointing people towards a wall and expecting them to just climb it.
PS: I recently poked at a pyboard, which runs MicroPython. Kinda nice in that it just works.
I used Arduino about a year ago to develop an IoT project with that same mindset: get it working in Arduino and then port it. How hard can it be?
Well, for one, it was TOO easy in Arduino. There were so many example projects, code samples and auxiliary resources that it was almost toooooo easy to get the project up and running. Armed with my proof of concept, I started to solicit customers and, as mentioned, it crashed.
A chasm developed between the POC that worked like child's play and the potential production HW that I wanted to supply to customers. It eventually meant that a parallel development track would have been necessary so that when it came time to switch away from Arduino, I would be ready.
Was it my shortsightedness that took me to that point? Well, maybe. But in the end I agree with the sentiment of the article. If you are going to use an Arduino, make sure you have an exit strategy, or else endure the pain of a longer development cycle using the professional tools that you would have to use anyway.
That said, the myth that Arduino has no limitations is hard to dispel when you see the PCBs finding their way into so many off-the-shelf systems from China etc.
HavenTech