It may no longer be a novel claim, but the efficient and intelligent use of data can transform public sector organisations. It can reduce costs, enhance revenue, optimise processes, and improve customer/user experience. Put simply, data gives people the right information to make better decisions - which is why we continue to focus so heavily on empowering the public sector to get the most out of their data.
As arguably the most powerful data-powered tool we have, artificial intelligence (AI) can offer more accurate insight and automate decision-making to save time and money at tremendous scale. For example, last week we laid out how its use in the rapid prototyping of innovative analytics use cases can bring huge value relative to the time and resources needed to get started.
As mentioned in relation to rapid prototyping, these kinds of advanced analytics should become a core component of business-as-usual (BAU) activity for public sector organisations as they look to deliver better outcomes through better use of their data. The challenge is that they can be expensive and require highly technical skills that are not always present at all levels across the public sector. Similar to rapid prototyping, Low-code/No-code (LCNC) offers a lower cost and quicker-to-execute way of doing advanced analytics. LCNC for software development is nothing new, but its use in AI is not all that common - yet!
LCNC AI offers an easy-to-use interface for deploying machine learning models and analytics without the need to write code - giving non-data scientists the ability to create insight at the click of a button…or two. This is especially relevant in the public sector, where data science skills gaps exist. Having the option to upskill staff in the use of LCNC AI can present a great opportunity for embedding AI use within ministries and departments.
However, LCNC is by no means perfect. There is still a level of specificity that needs to be included, so the need for data scientists will not be completely removed. There will be nuances and issues with data that require the keen eye of a data scientist to make sure that any model being implemented is robust and accurate. But make no mistake, LCNC greatly reduces the reliance on data scientists and creates a light-touch version of data science for use by product teams. LCNC AI can empower staff within an organisation to deploy more advanced analytics whilst freeing up data scientists to spend their time on higher-impact business problems, enabling them to be more innovative and experimental rather than spending time on repetitive tasks.
In the future, AI may be able to autonomously build ML models from an organisation's data - but until then there will be a need for human input to provide the situational awareness the model lacks. In particular, the use case must be created with subject-matter expert (SME) knowledge of the business and its operations.
LCNC platforms generally work in one of two ways:
1. A drag-and-drop interface, where users choose the elements and models they want to include in their application and push them to a visualisation layer; or
2. A wizard, where users answer simple questions and use drop-downs to build their models within an application.
Different tools work in different ways, so it’s important to understand your organisation’s needs and experience with development to ensure you get the most out of any tool.
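To make the wizard route concrete, here is a minimal sketch of the pattern behind it: plain-language answers from drop-downs are mapped to a pre-built model template, so the user never writes modelling code. The questions and model names below are purely illustrative, not taken from any specific platform.

```python
# Hypothetical "wizard" flow: two drop-down answers select a
# ready-made model template. Real LCNC platforms do the same
# mapping behind their UI, then handle training and deployment.

WIZARD_CHOICES = {
    ("predict a category", "small dataset"): "logistic_regression",
    ("predict a category", "large dataset"): "gradient_boosted_trees",
    ("predict a number", "small dataset"): "linear_regression",
    ("predict a number", "large dataset"): "random_forest_regressor",
}

def pick_model(goal: str, data_size: str) -> str:
    """Map two wizard answers to a pre-built model template."""
    try:
        return WIZARD_CHOICES[(goal, data_size)]
    except KeyError:
        raise ValueError(f"No template for ({goal!r}, {data_size!r})")

print(pick_model("predict a category", "large dataset"))
# → gradient_boosted_trees
```

The value of the pattern is that the hard decisions are made once, by specialists, and then reused by everyone else through a handful of simple questions.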
Naturally, Big Tech has invested heavily in these types of platforms in the past 10 years and already provides some useful cloud-native add-ons for easy use. Below we’ve taken a look at a few examples of tools which are currently available and provided a bit of framing around what you can expect from each:
Amazon SageMaker JumpStart offers state-of-the-art, built-in foundation models for use cases such as content writing, code generation, question answering, copywriting, summarisation, classification, information retrieval, and more.
This isn’t for the absolute beginner. Don’t expect to plug some data in and press go. There are features for fine-tuning that can be tricky for non-data scientists to navigate. But it does bring a level of automation to the modelling.
Google AutoML is also not for beginners. An understanding of ML is recommended. Google boasts: “AutoML enables developers with limited machine learning expertise to train high-quality models specific to their business needs. Build your own custom machine learning model in minutes.”
There is a great repository of managed algorithms for the most common use cases which can be easily deployed. Again, this is not something for someone brand new to data science, but it would help data scientists do their jobs more quickly.
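What a managed algorithm repository automates is essentially the try-several-models-and-keep-the-best loop that a data scientist would otherwise script by hand. A rough, pure-Python analogue of that loop is sketched below; the "models" here are toy one-step forecasters for illustration, not real ML algorithms.

```python
# Toy analogue of an automated algorithm sweep: evaluate several
# candidate models on the same data and keep the lowest-error one.

def mean_model(history):
    """Predict the running mean of everything seen so far."""
    return sum(history) / len(history)

def last_value_model(history):
    """Naive 'persistence' forecast: predict the last value again."""
    return history[-1]

def evaluate(model, series):
    """Mean absolute error of one-step-ahead predictions."""
    errors = [abs(model(series[:i]) - series[i]) for i in range(1, len(series))]
    return sum(errors) / len(errors)

def auto_select(series, candidates):
    """Return the candidate with the lowest error, AutoML-style."""
    return min(candidates, key=lambda m: evaluate(m, series))

demand = [10, 12, 11, 13, 12, 14, 13, 15]
best = auto_select(demand, [mean_model, last_value_model])
print(best.__name__)  # → mean_model
```

A real platform runs the same loop over dozens of production-grade algorithms with hyperparameter tuning, which is exactly the repetitive work worth automating.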
Lobe is a simple tool for training image recognition models. Microsoft developed it to enable users to build simple models, with automatic model selection based on the user’s workload. No coding is needed.
The user interface and language can be tricky for a total novice to navigate. An awareness of what image recognition is and does is important.
For now, the best thing public sector organisations can do is seek opportunities for automating elements of data science, rather than automating it fully. The benefits of using LCNC lie in the prototyping and innovation space.
At PUBLIC, we have developed tested methodologies to help organisations define their use cases for quick and effective test-and-learn using LCNC platforms. We can assist either in getting your data teams set up and trained in the use of these platforms, or in running the rapid prototyping ourselves to deliver the results into your organisation without the need for you to rely on data science skills internally.
LCNC has the potential to revolutionise the way data science is performed in the public sector. Currently - for the most part - innovation in intelligence delivery is blocked by a lack of skills and time. LCNC puts advanced analytics capability into the hands of the teams who need it, quickly!
As usual with AI, the supposed benefits are not fully realised yet. At this point, these tools are not advanced enough to be fully adopted by the layperson. The user interfaces are still filled with data science language, and most models still need fine-tuning and interpretation. So data scientists need not worry about losing their jobs - on the contrary, their jobs can actually become a lot easier through the use of these tools. That’s where the real benefit of the LCNC platform comes in: data scientists can speed up their model deployment using the automated algorithm repositories within the platforms.
There are still other blockers to success, and they are the same blockers we have seen in the public sector over the past 10-15 years. Access to good data is top of the list. Architecture challenges blocking data unification, a lack of good APIs blocking optimised data pipelines, and a shortage of the right people to create value-driven use cases are all important challenges for most organisations to overcome.
Although most algorithms still require some data science support to be deployed in a robust manner, some are already in use without data science knowledge; LLMs are a great example of this. Simple user interfaces and well-crafted APIs that turn a single model into dedicated software are the key to effective use by anyone. Most LCNC platforms on the market are a selection box of complex models, and not all can be treated the same in terms of fine-tuning and productionised deployment.
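The "dedicated software for a single model" idea can be sketched in a few lines: all the complexity sits behind one function with plain inputs and outputs, so a non-specialist never touches the model itself. The keyword-matching "model" below is a deliberately trivial stand-in for a real trained classifier.

```python
# Illustrative one-function API over a single model: a caseworker
# passes in free text and gets a plain label back. The keyword lists
# are a toy stand-in for a real sentiment model behind the same API.

POSITIVE = {"good", "great", "helpful", "excellent"}
NEGATIVE = {"bad", "poor", "slow", "unhelpful"}

def classify_feedback(text: str) -> str:
    """Label a piece of citizen feedback as positive, negative or neutral."""
    words = set(text.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(classify_feedback("The new portal was great and very helpful"))
# → positive
```

Because the interface is just text in, label out, the underlying model can later be swapped for something far more sophisticated without the users of the API noticing.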
More of these technologies will emerge over the next few years and early adopters will reap the benefits of being able to deliver ML use cases and improve countless departmental outcomes. When combined with an organisational increase in data literacy, departments can give themselves a massive boost in capabilities that can make meaningful changes to their internal operations as well as the public services they deliver.
If you’re looking to identify where ML can bring value to your organisation and locate the best use cases to take forward, reach out for a chat with Thomas Chalk or Mahlet Yared from our Data & AI (DAI) Team to get some expert insight. You can get your organisation up and running and deploying ML on LCNC platforms much quicker than you think!
Over the next few weeks, we’ll be sharing more of our ideas, perspectives and approaches that inform the work we are doing across our fast-growing Data & AI (DAI) practice. We’re keen to spark new, meaningful discussions and co-develop novel ideas around these topics, so please do engage with our team along the way.