{"id":55761,"date":"2022-07-21T16:09:00","date_gmt":"2022-07-21T15:09:00","guid":{"rendered":"https:\/\/www.microsoft.com\/en-gb\/industry\/blog\/?p=55761"},"modified":"2023-05-04T12:52:38","modified_gmt":"2023-05-04T11:52:38","slug":"its-a-great-time-for-programmers-to-learn-low-code","status":"publish","type":"post","link":"https:\/\/www.microsoft.com\/en-gb\/industry\/blog\/technetuk\/2022\/07\/21\/its-a-great-time-for-programmers-to-learn-low-code\/","title":{"rendered":"It\u2019s a great time for programmers to learn low-code"},"content":{"rendered":"
Low-code\/no-code techniques have enormous potential<\/a>, particularly when bolstered by artificial intelligence<\/a> (AI). Several thousand pundits, thought leaders and consultants already have stories to tell on the theme that “even non-professionals can create apps in no time\u2026”. This success story is the fruit of many interesting social and technological developments.<\/p>\n By the late 1950s, COBOL and FORTRAN (as they were then capitalised) promised that programs would be readable by non-specialists. “Readable” was the salient criterion for the time, of course; until roughly 1975, English could barely express the idea that people in business might render their own digital content by typing. “Typist” and its variations continued to grow as a profession for about another 15 years.<\/p>\n Low-code long ago succeeded, in the sense that the majority of programming is already done by non-professionals. Am I serious? Yes: by any appropriate measure, the world’s most-used programming language is Spreadsheet, as implemented by Microsoft Excel, Google Sheets and their competitors. At least <strong>a billion<\/strong> people worldwide use spreadsheets<\/a> to get results that would take <em>years<\/em> to arrive if they had to wait on the planet’s 25 million programmers<\/a>. Low-code won, a long time ago.<\/p>\n This is a huge accomplishment that brings with it inevitable challenges. For example, several scientific researchers have independently analysed peer-reviewed papers published in genetics and clinical medicine, and accumulated evidence that between 30 percent and 92(!) percent of these contain blatant errors traceable to common spreadsheet fumbles.<\/p>\n But these incidents testify to spreadsheets\u2019 <em>success<\/em>, rather than failure. In each instance, non-specialists are producing results on their own that would have cost more, and taken more time, if they\u2019d waited for professional programmers. 
As embarrassing as mass-market coverage makes the errors appear to be, they\u2019re secondary to the <strong>actionable results<\/strong> that emerged from \u201cend-user computing\u201d. Non-specialists know little of the limits of the tools they use, and that\u2019s as it should be.<\/p>\n In any case, programmers have the opportunity to work alongside citizen developers and build on low-code. We can build and enhance apps, automate processes and create virtual agents more quickly, helping organisations innovate and achieve their objectives faster.<\/p>\n Low-code also offers developers the opportunity to learn powerful new skills that can advance their careers, earn recognition from their peers and attract employers.<\/p>\n <h3>Why programmers will remain in demand<\/h3>\n Of course, even when the most richly AI-ified no-code performs perfectly and an initial demonstration or roll-out succeeds, \u2018base-layer\u2019 coding skills will remain in demand. Apps will still require installation, and will still need to be reconciled with local expectations for security certificates, localised text, multi-factor authentication and so on. A programming mentality will play a crucial role for decades to come<\/a>.<\/p>\n If “programmer” to you means someone who just churns out more of the same Java or CSS (or COBOL) learned at university, then, yes: low-code might well render such specimens extinct. If, however, you see professional software development as an adventure in life-long learning<\/a>, low-code will be your friend. It’s a great time to learn low-code<\/a>.<\/p>\n <h3>Learn more<\/h3>\n Low-code\/no-code techniques have enormous potential, particularly when bolstered by artificial intelligence (AI). Cameron takes a look at its history and why it’s a great time to learn.<\/p>\n","protected":false,"author":430,"featured_media":31287,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"ep_exclude_from_search":false,"_classifai_error":"","footnotes":""},"categories":[594],"post_tag":[1505,223,519],"content-type":[],"coauthors":[1842],"class_list":["post-55761","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-technetuk","tag-innovation","tag-skills","tag-technet-uk"],"yoast_head":"\n