Cameron Laird, Author at Microsoft Industry Blogs - United Kingdom

Why GitHub Actions is invaluable for devs
http://approjects.co.za/?big=en-gb/industry/blog/technetuk/2022/10/13/why-github-actions-is-invaluable-for-devs/
13 October 2022

Cameron looks at GitHub Actions, which makes it easy to automate all your software workflows.

[Image: the GitHub logo next to an illustration of Bit the Raccoon]

Learn GitHub Actions. If you’re already familiar with it, deepen your knowledge.

As programmers, we tend to focus on computing languages and perhaps the integrated development environment (IDE) we use. That makes sense, and there’s plenty in computing culture pushing us to study the latest syntax, or the best libraries and frameworks for our work. Yes, dive into Selectors Level 4 and Jupyter and RxJava and Big Data Clusters. Those will always be fundamental.

You also need to experience for yourself the workflow automation that Actions offers. Think of it as programming, but on a higher level. You absolutely need the basics of your chosen technologies; at the same time, workflow automation helps you leverage that knowledge in a way that’s hard to understand until you try it for yourself.

Force multipliers

We’ll look at Actions from several different angles, because otherwise it’s easy to underestimate.

GitHub started as a rather passive repository: developers put source code there, and developers retrieved source code from it. GitHub offered new services through the years, including Pages to host websites based on stored source.

Actions activates all of those capabilities, because Actions executes source. Most introductions to Actions present its CI/CD capabilities, and those are indeed important to the point of being indispensable for many development projects. CI/CD is almost a side effect, though, of Actions’ more general collaborative and computational capabilities.

Think for a moment about applications out in the real world. Amazing results are now feasible because what used to be top-level research projects in motion detection or natural-language processing or geographic information systems have been reduced to open-source libraries that anyone can incorporate. Actions helps focus some of that same power on our domain: source code. Source repositories are no longer passive dumps where we pour our source in, but increasingly active team members that automate more and more of the tedious work of programming.

And that is true because Actions isn’t just a tool, but a workplace for sharing best practices. Consider, say, security vulnerabilities. You probably recognise the benefits of automatic scanning of your source code to identify potential vulnerabilities. You might even recognise that Actions, like other CI/CD tools, is a great way to automate those scans to find problems long before they can affect your users. But there’s more! Actions also supports an entire marketplace of shareable executable pieces that embody best practices in this and related areas. You don’t have to reinvent common wheels for issuing notifications and tracking progress and reporting results. Others have already done this, and many of those achievements are available for your use.

Review stale issues that deserve closure? There’s an Action for that. Help newcomers set up an effective Go development environment? That’s in there. Customise localisation processes so your application speaks like a native? Actions can help.

You owe it to yourself to learn enough of Actions to handle as many of the administrative processes around development as possible. That will free you to turn your attention back to the programming you like, secure in the knowledge that Actions is taking care of the other parts.

Look to the future

Keep up with the latest Actions news because, along with all the benefits and features mentioned above, Actions is more and more a platform for collaboration by the distributed teams of the post-pandemic future.

Learn more

It’s a great time for programmers to learn low-code
http://approjects.co.za/?big=en-gb/industry/blog/technetuk/2022/07/21/its-a-great-time-for-programmers-to-learn-low-code/
21 July 2022

Low-code/no-code techniques have enormous potential, particularly when bolstered by artificial intelligence (AI). Cameron takes a look at its history, and why it’s a great time to learn.

[Image: Earth, next to an illustration of Bit the Raccoon]

Low-code/no-code techniques have enormous potential, particularly when bolstered by artificial intelligence (AI). Several thousand pundits, thought leaders and consultants already have stories to tell on the theme that “even non-professionals can create apps in no time…”. This success story is the fruit of many interesting social and technological developments.

By the late 1950s, COBOL and FORTRAN (as they were then capitalised) promised that programs would be readable by non-specialists. “Readable” was the salient criterion for the time, of course; until roughly 1975, English could barely express the idea that people in business might produce their own digital content by typing. “Typist” and its variations continued to grow as a profession for about another 15 years.

Low-code long ago succeeded, in the sense that the majority of programming is already done by non-professionals. Am I serious? Yes: by any appropriate measure, the world’s most-used programming language is Spreadsheet, as implemented by Microsoft Excel, Google Sheets and their competitors. At least a billion people worldwide use spreadsheets to get results that would take years to arrive if they had to wait on the planet’s 25 million programmers. Low-code won, a long time ago.

This is a huge accomplishment that brings with it inevitable challenges. For example, several researchers have independently analysed peer-reviewed scientific papers in genetics and clinical medicine, and accumulated evidence that between 30 and 92(!) percent of them contain blatant errors traceable to common spreadsheet fumbles.

But these incidents testify to spreadsheets’ success, rather than failure. In each instance, non-specialists are producing results on their own that would have cost more, and taken more time, if they’d waited for professional programmers. As embarrassing as mass-market coverage makes the errors appear to be, they’re secondary to the actionable results that emerged from “end-user computing”. Non-specialists know little of the limits of the tools they use, and that’s as it should be.

Why programmers will remain in demand

In any case, programmers have the opportunity to work alongside citizen developers and extend low-code. We can build and enhance apps, automate processes and create virtual agents more quickly, helping organisations innovate and achieve their objectives faster.

Low-code also offers opportunities for developers to learn powerful new skills that can advance their careers, earn recognition from their peers and attract employers.

Of course, even when the most richly AI-ified no-code performs perfectly and an initial demonstration or roll-out succeeds, ‘base-layer’ coding skills will remain in demand. Apps will still require installation, need to be reconciled with local expectations for security certificates or localised text or multi-factor authentication, and so on. A programming mentality will play a crucial role for decades to come.

If “programmer” to you means someone who just churns out more of the same Java or CSS (or COBOL) learned at university, then, yes: low-code might well render such specimens extinct. If, however, you see professional software development as an adventure in life-long learning, low-code will be your friend. It’s a great time to learn low-code.

Learn more

Why developers need to look beyond code
http://approjects.co.za/?big=en-gb/industry/blog/technetuk/2022/04/01/why-developers-need-to-look-beyond-code/
1 April 2022

Cameron takes a look at how it’s not just code and software that needs to update and change.

[Image: Earth, next to an illustration of Bit the Raccoon]

You already know that programming commits you to continuous learning. It’s easy to think of the objects of that learning as technologies: functional programming languages, key vaults, source scanners, and so on. Those are indeed important and valuable.

Our technologies aren’t the only things changing, though. We always apply them within a larger cultural context, and we’re under-trained to deal with those changes.

Things we know that aren’t so

As a warm-up, recall Kevin Deldycke’s list of “Awesome Falsehoods” for programmers, including: “My system will never have to deal with names from China”, “A time stamp of sufficient precision can safely be considered unique”, “My software is only used internally/locally, so I don’t have to worry about time-zones” and “Places have only one official address”.
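It’s easy to check for yourself how quickly one of these falsehoods bites. Here’s a minimal sketch – assuming CPython 3 on ordinary desktop hardware, where time.time() returns wall-clock seconds as a float of finite resolution – that manufactures timestamp collisions on demand:

    import time

    # Falsehood under test: "a time stamp of sufficient precision can
    # safely be considered unique". Collect a million wall-clock readings
    # as fast as possible and count how many collide.
    stamps = [time.time() for _ in range(1_000_000)]
    duplicates = len(stamps) - len(set(stamps))
    print(f"{duplicates:,} duplicates out of {len(stamps):,} timestamps")

On typical hardware the duplicate count is far from zero, because the loop iterates faster than the clock’s effective resolution. Uniqueness has to come from somewhere else – a sequence number, say, or a UUID.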

Twenty years ago, naive applications assumed gender was an unremarkable, immutable, single-bit attribute; current interpretation of the Equality Act 2010 elevates such data-design choices into the realm of legal consequences.

I don’t know what the world of even the near future will be, and I’m sceptical of those who say they do. A few trends are so potent and certain, though, that they deserve our attention before they surprise us.

Traditional banks aren’t set in stone

From their material architecture to their public-relations style, banks project an impression of permanence and gravity. This is just a cultural convention, though. The rising generation in industrial countries goes about its business with less and less use of cash and cheques, and often without traditional banking at all.

In 2012, the World Bank wrote about the “problem” of the unbanked. It increasingly appears that many of the world’s young and poor – still a majority of the world’s population! – will meet their needs without ever joining the ranks of the “banked”. They don’t see themselves as losing the banking game; they aren’t even playing it.

Credit cards may face similar changes. While global credit-card use continues to grow at around 3 percent annually, alternative practices such as Buy Now, Pay Later (BNPL) are exploding at over 22 percent per annum. While BNPL brings its own risks for consumers, vendors and the larger economy, those factors may remain secondary in the face of marketplace adoption. If you’re a financial programmer, you need to learn not just Angular and continuous integration (CI), but also the BNPL vocabulary.

Super apps

These shifts in consumer finance complement the “Rise of the Super App”. For more and more people – perhaps a majority, before long – money lives in their mobile phones. We’re long past the times of money conceived as precious metal, money as tangible currency, or even money as figures on a full-sized computer screen. Behemoth computing platforms like WeChat, Facebook, and Snapchat are where consumers and businesses conduct their social and economic transactions.

Keep in mind the phenomenal growth rates involved. When usage of something like M-Pesa grows by 50 percent or more each year, a large share of its current users signed up within the past twelve months – at 50 percent growth, one user in three (0.5/1.5). Whatever habits or practices are accepted now might well become obsolete in just a few months as a whole new generation of users is recruited.

Devs and demographics

Further complementing these shifts are the demographics involved. Coding is the fastest-growing employment sector in the UK, with private industry adding around 25,000 programmers a year.

India graduates about ten times as many computer scientists and software engineers annually.

Nigeria, Kenya, Egypt, and several more countries not often regarded as technology centres also appear to be growing at least tens of thousands of new programmers each year.

The point is not to fear these changes, but simply to recognise their reality. The consumers and the providers of the business applications of the coming years will have different backgrounds, expectations, and daily lifestyles from those who have dominated computing to this point.

If we, as developers, insist on just sticking to the “bits and bytes”, it might soon be a challenge to keep doing what we do. There are no guarantees that the specific jobs of today will continue. When we open ourselves up just a little to new end-user expectations and populations, though, whole new worlds of opportunity appear.

Learn more

Themes in reliable computing
http://approjects.co.za/?big=en-gb/industry/blog/technetuk/2022/04/01/themes-in-reliable-computing/
1 April 2022

Cameron takes a look at how trust in our computations is increasingly a requirement, and how commercial sectors can learn from those who think about it on a daily basis.

[Image: Earth, next to an illustration of Bit the Raccoon]

Programming applications that truly must work – in medicine, transportation, factory automation, and so on – is different from the software development that most of us do on a daily basis. In general, no one dies or crashes or explodes when our pixels don’t line up or the supermarket’s quarterly sales total is centred rather than right-justified.

Trust in our computations is increasingly a requirement, though, and even those who develop for the relatively forgiving commercial sectors have lessons to learn from those who think about guarantees on a daily basis.

What’s reliable about computing?

Start with a few vocabulary clarifications. “Trust” in computing usually refers to such security considerations as authentication – are the users who they say they are? – and authorisation – are those same users actually permitted to do what they’re asking? “Reliable computing” generally is understood in terms of guarantees: these inputs will yield desired results every time, without exception.

Programmers usually classify a denial-of-service exploit, for instance, as a security problem, or possibly even as a merely operational detail outside programmers’ responsibility. Consider this example, though: suppose an account holder should see, in a particular circumstance, “Your current balance is £154.38.” Because an attack on another part of the system causes network congestion – or for any of the other usual reasons – something goes wrong, and the best the application can offer is “… try again later.” That’s a good message – certainly better than the esoteric tracebacks some retail applications still show on occasion. The application is correct. It meets its specifications.

Notice, though, that, for consumers, the application has become unreliable. At this point, consumers hope for a different answer, with no guarantee when it will arrive. The application is simultaneously correct, yet not reliable.

At an IT level, “reliability” has to do with blowing dust off motherboards, installation of antivirus software, hard-drive defragmentation and minimisation of plugins.

In some circles, “reliable computing” refers to the rather narrow branch of mathematics that analyses arithmetic results and how precision propagates through them. Most of us, most of the time, have the luxury of indifference about whether “1/5” displays as “0.20000000”, “0.20000001”, or even something else. For certain specialists, those differences are exactly the domain of “reliable computing”.
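A few lines of Python – or any language with IEEE-754 binary64 doubles – show what those specialists are looking at:

    # 1/5 has no exact binary representation; asking for more digits
    # exposes the rounding that the default display politely hides.
    print(f"{1/5:.17f}")     # 0.20000000000000001

    # The imprecision propagates through arithmetic:
    print(0.1 + 0.2)         # 0.30000000000000004
    print(0.1 + 0.2 == 0.3)  # False

Quantifying how such tiny roundings accumulate across millions of operations is exactly the specialists’ concern.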

Some dialects of computing culture even swap the meanings of “trust” and “reliability”. The most important point in any teamwork is to agree lucid definitions of what you’re aiming at. For today, we have a general interest in the software dimensions of computing results and the factors that support or interfere with getting the right answers every time.

Tackling the hazards

The sheer number of categories of interference surprises newcomers to the field; they include at least:

  • Security considerations: is this computer allowed to make this computation in this circumstance?
  • Platform health: are power, networking, security certificates, mass storage, and third-party libraries as available as designed?
  • Memory exhaustion: does the system have enough memory for stack and heap to work as intended? Who scrambles when they do not?
  • Deployment: how graceful is it to update the system? Is there a defined process for deployment, or is it left to the ingenuity of the operator? What criteria are in place for agreement that the system is back to “normal” after a restart?
  • Logging: suppose a system is performing perfectly, but the logger to which it reports periodically itself goes offline. What should happen then? (One possible answer is sketched after this list.)
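To make that last bullet concrete, here’s a minimal sketch of one defensible answer, using Python’s standard logging machinery; the collector host, port and file name are illustrative placeholders rather than recommendations:

    import logging
    import queue
    from logging.handlers import QueueHandler, QueueListener, SocketHandler

    # Decouple the application from the remote log collector, so that a
    # collector outage degrades logging rather than the application itself.
    log_queue = queue.Queue(maxsize=10_000)  # bounded: overload drops records
                                             # instead of exhausting memory

    remote = SocketHandler("logs.example.internal", 9020)  # placeholder host;
                                                           # retries quietly
                                                           # while unreachable
    local_copy = logging.FileHandler("app-fallback.log")   # every record also
                                                           # lands on local disk

    # A background thread drains the queue and feeds both handlers.
    listener = QueueListener(log_queue, remote, local_copy)
    listener.start()

    logger = logging.getLogger("reliable-app")
    logger.setLevel(logging.INFO)
    logger.addHandler(QueueHandler(log_queue))

    logger.info("balance query served")  # never stalls on a dead collector

The design choice here is that the application thread only ever performs a cheap in-memory enqueue; slow or dead logging infrastructure shows up as delayed or dropped records, never as a stalled application.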

The biggest countermeasures we’ve invented have to do with our tools. Rather than “artisanal C” on a repurposed operating system (OS), we’re more likely nowadays to program in a language which builds in memory management, to execute on a real-time OS, and to have both in-process memory scanners and OS-level dashboards. As a result, memory misuses are far less common today.

At least, I believe so, and I can find multiple other practitioners who echo this. At the same time, software engineering is barely into its first century of practice; it has been filled with intense change throughout, and we’ve collectively had little opportunity to make even such basic measurements as the frequency through time of memory faults. We’re just at the beginning of the research that can lead to any meaningful conclusions.

Explore and adapt best practice

What does make for reliable computing, then? To a large extent, you’ll need to research your own particular domain to answer that question well. Study what has worked for medical devices or space probes as a source of ideas and practices you can test in your own situation, rather than as “silver bullets” that work magic. In general:

  • High-level languages are more expressive.
  • Testing probably helps.
  • Doubt everyone’s estimates.
  • Organisational dynamics trump geography.
  • Inspection probably helps.
  • “Good team communication” might be indispensable.

Reliable computing is a worthy goal. Its achievement seems to depend on more than just purchase of the right products or adoption of a particular methodology. Until we know more, assume you’ll need to customise the themes of reliable computing to the specific programming you do.

Learn more

Computing is at the centre of the biggest news of 2021
http://approjects.co.za/?big=en-gb/industry/blog/technetuk/2021/12/30/computing-is-at-the-centre-of-the-biggest-news-of-2021/
30 December 2021

Cameron shines a light on how we adjust our workflows and “soft skills” to make the best way forward in a new world.

[Image: a modern workplace, next to an illustration of Bit the Raccoon]

The COVID-19 pandemic and our response to it remains humanity’s biggest event in 2021, with over three million deaths attributed to the disease and its immediate complications this year. Computing is deeply involved in our response: public-health tracking, epidemiological modelling, the extraordinarily rapid roll-out of effective and novel vaccines, and the automation technologies involved in production of monoclonal antibodies are all visible countermeasures, feasible only with the help of cheap and reliable computing.

As much as those technologies fascinate and motivate me, though, I see that true change operates at the cultural level, and computing in 2021 has quite a story to tell there, too.

Commercial society has transformed itself enormously, on the fly, in the last two years. Two years ago, shifts of work, entertainment and socialising from in-person to remote were the province of futurists. Now they’re irreversible commonplaces: plenty of workers, for example, are going back to their offices either never or only on special occasions. The CEOs planning a return to “normality” are often just revealing that they feel safer in their personal comfort zones. Researchers have by now accumulated plenty of evidence that distributed workforces can be at least as creative and cohesive as conventional commute-sufferers. For better or worse, a majority of first-world romantic partners now meet each other online.

And digital technologies enabled all this! Plenty of mass-market articles already focus on the tools to use when working remotely. The aim of this post, in contrast, is to shine a light on the implications specifically for developers, and on how we adjust our workflows and “soft skills” to find the best way forward in this new world. That story has only begun to be told – and it starts with our approach to collaboration.

Plan for asynchrony

The single biggest step you can make is to embrace asynchrony. Learn to work productively without the luxury of instantaneous responses. Help your collaborators keep moving forward, even in your absence.

Synchronous communication has its place. One of my own roles is to consult with teams that provide support to development projects. There, we have specific service-level expectations, such as responding within two minutes to questions like “why is $SOME_HOST rejecting my ssh request?”

While we’ll return to that example in a moment, the need for rapid response applies in a minority of cases for software workers. For the most part, creatives like programmers, designers and architects do well to “microservice” their intellectual contributions to be loosely coupled. Figure out what your own best rhythms are; if you’re like most of the professionals I see, “day” or “hour” is more likely than “minute”. Work out expectations with your team. Maybe you’ve had the habit in the past of apologising when you take a call or a walk for thirty minutes; it’s time to leave those apologies behind as distractions. Train your colleagues that you’ll take care of their questions, but over a day or a week, rather than in a few seconds. Focus on being fully present when you do address a particular matter.

The centrality of git

The one central “hard” technology for the current shifts in programming culture is git. Its usual description as an “open source distributed version control system” underplays the extent to which git has become a platform for development activities of the world’s 25 million programmers as well as a growing number of non-programmers. This is how we program now:

  • Administration of version-control repositories is increasingly in the hands of cloud vendors: GitHub, GitLab, Azure DevOps, Bitbucket and so on.
  • Collaborative workflows are based on “pull requests” and their reviews.
  • The CI/CD/CT pipelines fundamental to DevOps are increasingly constructed from building blocks within frameworks such as GitHub Actions, pre-commit, GitLab CI/CD and so on.
  • Where basic websites of the past were by default self-hosted, or built on Myspace or eventually on AWS virtual machines, now it’s often easiest to rely on GitHub Pages or similar.
  • GitHub and its competitors offer increasingly potent project-management tools, such as ticketing systems.
  • Even such software as issue trackers, administrative configuration management databases, and effort reporters increasingly present themselves in terms of their integrations with git-based services.

Notice how many of these features complement distributed collaboration. Now and in the future, work gets done in the framework that git defines.

Recognition of git’s importance is strategic to your career. Thousands of providers promise to certify you in such languages as JavaScript and C# and so on. Fluency in computing languages is indeed indispensable to a programmer; however, even shallow exposure is frankly adequate for most of daily commercial work. Real-world programmers rarely spend their time in subtle algorithmic analysis and clever meta-programming syntheses. Instead, our biggest tasks are to integrate with available libraries and configure higher-order workflows. That’s more the domain of git.

At a concrete level, then, set yourself two big git-related goals for the coming year. While it’s always helpful to learn more about the programming languages you use and the full range of git’s capabilities, those kinds of studies now deserve to be secondary. You’ll gain the most leverage when you:

  • Read the blogs of the advocates for GitHub Actions, pre-commit, and other services built on top of git; and
  • Reshape your own team’s culture so that collaboration is as engaging and productive as possible. That probably entails some combination of healthy code reviews and at least some of the elements of swarm programming.

Make progress in these areas, and your programming will feel entirely different over the next year. Not only will you be more productive in a straightforward sense, but you’ll also be better aligned for the changes in work habits that are to come.

Rapid response

While the current premium is on learning asynchronous skills, specific situations for quick turn-around remain, including the support-centre work mentioned above. One current fashion is to apply “artificial intelligence” and aim to replace human teamwork with service by bots. That’s a subject for another day. An alternative you can undertake immediately, though, is to promote cultural values of (partial) automation and self-help services. Like other opportunities above, this is more about “soft” elements of attitude and practice than specific “hard” technologies.

Steps to take

As a programmer or related creative, you’re right now in the middle of an upheaval operating at global scale. This is the first time through it for all of us. Be wary of anyone who claims certainty. In all likelihood, though, this is a great occasion for you to concentrate on the soft skills that promote effective teamwork, and to learn enough git to begin taking advantage of the remarkable assets now embedded in git-based frameworks.

Resources
