
A ‘Process’ For The AI Economy, Appian Weaves Richer Textures Into Data Fabric

Low-code has elevated itself, pun intended. The rise of low-code application development technologies - designed to architect software and to automate and accelerate workflow processes - has been hugely influential in the way modern (increasingly cloud-native) IT systems are built and operated. Software platforms at this level now carry an essential responsibility to work with data in a unified, secure, discoverable and optimized way - a reality significantly amplified by the arrival of generative Artificial Intelligence (gen-AI) and its impact on business operations and human welfare.

That might sound like a grandiose exposition statement, but it encapsulates the methodologies Appian has grounded its platform development on and reflects its CEO’s rather exacting analysis of the business data landscape today.

Often regarded as a low-code software platform company, the organization now spans a wider purview encompassing process mining and unified information functions through the provision of its data fabric technology. Some 25 years on from its formation (and still with the same four founders on its board), the low-code Appian of yesterday is now more of a holistic, data-centric organization seeking to explain how Appian Data Fabric, Process HQ and other tools underpin the way responsible, functional and practical AI will work in the immediate future.

What is a data fabric?

By way of explanation, Appian defines a data fabric as, “An architecture layer and toolset that connects data across disparate systems and creates a unified view. It is a virtualized data layer. That means users don’t need to migrate data from where it currently lives, say in a database, Enterprise Resource Planning (ERP), or Customer Relationship Management (CRM) application. The data may be on-premises, in a cloud service, or in multi-cloud environments.”

Sometimes also referred to as an ontology (i.e. the scientific study of how entities exist and how they are grouped), a data fabric is also described as a semantic layer that can represent the place, shape and value of corporate data so that it can be communicated with via a common dialect - and as if it were all local, even when some of it is disaggregated. As such, a data fabric is argued to be an ideal way to control the use of different information sources in the new era of AI.
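The pattern is easier to see in miniature. The sketch below is purely illustrative (the class name, entity names and in-memory "backends" are all hypothetical, not Appian's implementation): a thin virtual layer maps logical entity names onto whichever system holds the records, so callers query one common dialect as if all the data were local.

```python
# Illustrative sketch of a virtualized data layer: queries use one logical
# vocabulary while records stay in their source systems (plain dicts here
# stand in for an ERP database and a CRM service).
class DataFabric:
    def __init__(self):
        self._sources = {}  # logical entity name -> (source label, rows)

    def register(self, entity, source_label, rows):
        """Expose a remote dataset under a unified entity name."""
        self._sources[entity] = (source_label, rows)

    def query(self, entity, **filters):
        """Query any entity 'as if local', without migrating the data."""
        _, rows = self._sources[entity]
        return [r for r in rows
                if all(r.get(k) == v for k, v in filters.items())]

fabric = DataFabric()
fabric.register("customer", "crm",
                [{"id": 1, "name": "Acme", "region": "EMEA"}])
fabric.register("order", "erp",
                [{"id": 10, "customer_id": 1, "total": 250.0}])

# One dialect over disparate systems: join CRM customers with ERP orders.
emea = fabric.query("customer", region="EMEA")
orders = fabric.query("order", customer_id=emea[0]["id"])
```

The point of the sketch is the indirection: nothing migrates, and access rules can sit in one place rather than in every consuming application.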

Acutely aware of the need to enable users with what he calls ‘data rights’ in relation to the way enterprise and personal software is deployed and used today, Appian CEO and co-founder Matt Calkins has already called for something of an AI epiphany that needs to happen globally. This awakening would involve us grasping the realization that we need to move onward from the AI hype that suggests it is some ‘existential threat’ to humanity and start to think about data privacy issues and the need to lock down information streams more tightly so that we can focus on using AI to actually increase business (and indeed personal) productivity.

Having previously explained that AI itself is not just a Machine Learning (ML) algorithm tasked with whirring away on some cloud service in a datacenter somewhere, Calkins reminds us that AI, crucially, is also a data access and data synthesis mechanism. The calmly spoken software engineer turned corporate leader is profoundly aware of what has been happening in the wider AI arena, where leaders often emerge as a result of sheer weight of influence.

It’s what Calkins calls ‘a monopoly not earned’. He says his vision for AI and data is not about beating the Turing Test (a measure of whether a machine’s ability to exhibit intelligent behavior is indistinguishable from that of a human); instead, our focus with enterprise software at all levels should be on extracting opportunities for really increasing productivity.

Drawing upon the Appian tagline ‘Orchestrate Change’, does the Appian CEO feel that humans are about to switch from being workers to some new notion of ‘orchestrators’ that sit over AI engines while the machines do all the work?

Humans will be ‘more’ human

“That’s not quite what’s going to happen,” advised Calkins, speaking to press and analysts in person in April 2024. “The role of humans is actually going to become ‘more human’ as we continue to enter the AI economy i.e. we won’t have to spend time working on tasks that are essentially repetitive and rote, we will be able to put people at the center of process control so that AI becomes a member of the team, but not the team in and of itself. But to communicate with AI (as a worker) we will need a common language and that part takes some work. Spoken human languages are verbose and imprecise, so they're not a good medium for creating logical instructions. Instead, we're going to need an intuitive graphical representation as our gateway to elegantly handling AI workloads of all shapes and sizes.”

This level of validation helps describe why low-code is no longer ‘just’ low-code that can be used to knock out a new retail app a bit more quickly; we’re now at a point where a platform like Appian has been engineered to provide end-to-end process automation. That means this technology should be used as more than ‘just a bot’ in Calkins’ terms i.e. it’s process and data control that spans Artificial Intelligence, Robotic Process Automation (RPA), Application Programming Interface (API) integration and an ability to dovetail and integrate all of the above with human workflow processes down to the individual task level.
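The "end-to-end" claim can be made concrete with a toy orchestration: a workflow that mixes AI, API and human steps, tracked at the individual task level. Everything below (step names, the invoice scenario, the function shape) is a hypothetical sketch of the pattern, not Appian's API.

```python
# Hypothetical sketch of end-to-end process orchestration: one workflow
# interleaves AI, API and human tasks, with an audit trail per task.
def run_workflow(steps, payload):
    audit = []
    for name, kind, fn in steps:
        payload = fn(payload)          # run the step
        audit.append((name, kind))     # record it at the task level
    return payload, audit

steps = [
    # (task name, task kind, task logic) - all illustrative
    ("extract_invoice", "ai",    lambda p: {**p, "amount": 1200}),
    ("post_to_erp",     "api",   lambda p: {**p, "posted": True}),
    ("manager_review",  "human", lambda p: {**p, "approved": p["amount"] < 5000}),
]

result, audit = run_workflow(steps, {"invoice_id": "INV-7"})
```

The interesting design point is that human review is just another step in the same pipeline, which is what "dovetailing" automation with human workflow amounts to.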

Deeper into business processes

All of this context brings us to where and how the company is now updating the latest version of the Appian Platform. The new release introduces Process HQ, a combination of process mining and generative AI unified with the Appian data fabric. Process HQ promises to provide visibility into business operations to enable data-driven decisions and process improvements. The latest version of Appian also extends the practical value of generative AI through enhancements to Appian AI Copilot and the prompt builder AI skill.

The company reminds us that business users need greater visibility into the full breadth of their enterprise data and processes in order to maximize operational efficiency and strategic decision-making. By combining the latest technologies in data fabric, process mining, machine learning and generative AI, Appian Process HQ is said to help monitor and improve every business process built on the Appian platform.

That’s an important point of clarification i.e. while Appian process technologies can be ‘pointed at’ software services and data resources that have been created on other platforms, the company’s deeper analytics and data control tools as showcased are more effectively applied when kept inside their own wheelhouse.

"Every organization wants to better understand their processes and find ways to improve them, but traditional process mining can be a daunting investment and doesn't always lead to actionable insights," says Michael Beckley, Appian’s CTO. "Thanks to Process HQ and Appian's unified process automation platform, we're streamlining the journey from insight to action with low upfront investment, deeper insights and the ability to rapidly act on improvements." Beckley further states that Appian Process HQ makes it easy to reduce costs, risks and delays, improve compliance and drive better business outcomes, without the need for costly and time-consuming data collection efforts.

Process HQ includes:

Inside the Process HQ outer wrapper we find Appian Process Insights. This technology lets business users without a background in process mining or data science uncover insights and explore their business processes through an AI-powered analysis of their workflows. Process Insights uses detailed audit information of both human and automated activity captured in Appian’s data fabric, providing visibility without a substantial effort. It uses AI to identify and quantify bottlenecks, errors and delays and provides intelligent recommendations for process areas with the most improvement potential. Users follow a guided experience to drill deeper into the details and can then quickly act on process improvements using Appian's process automation capabilities, all within a secure, enterprise-grade platform.
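The core of that kind of analysis is simple enough to sketch: given audit events (case, activity, timestamps), compute how long each activity takes on average and surface the worst offender. The field names and figures below are invented for illustration; real process mining adds far more (variants, conformance, recommendations).

```python
# Minimal sketch of bottleneck detection over an audit/event log:
# mean duration per activity, then the slowest activity overall.
from collections import defaultdict

def find_bottleneck(events):
    totals, counts = defaultdict(float), defaultdict(int)
    for e in events:
        totals[e["activity"]] += e["end"] - e["start"]
        counts[e["activity"]] += 1
    means = {a: totals[a] / counts[a] for a in totals}
    return max(means, key=means.get), means

# Illustrative event log: two cases, two activities each.
events = [
    {"case": 1, "activity": "intake",   "start": 0, "end": 2},
    {"case": 1, "activity": "approval", "start": 2, "end": 14},
    {"case": 2, "activity": "intake",   "start": 0, "end": 3},
    {"case": 2, "activity": "approval", "start": 3, "end": 11},
]

worst, means = find_bottleneck(events)
# 'approval' averages 10 time units against 2.5 for 'intake'
```

In a platform setting, the advantage claimed is that this event log already exists as audit data in the data fabric, so no separate collection effort is needed.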

Also inside Process HQ we find Appian Data Fabric Insights, allowing business users to explore enterprise data and build custom reports and dashboards. When partnered with Appian AI Copilot, users can gain new insights faster. Data Fabric Insights makes report creation possible for business users without any Appian development knowledge and also allows them to answer common business questions faster, without needing to rely on a data expert or developer to build a report. The company claims that organizations can save significant time and money with these capabilities and can be confident that only the right users can view certain secure data.

Generative AI enhancements

A new gen-AI enhancement is also offered here in the form of the Appian Prompt Builder AI skill. Business users can now not only create their own prompts, inputs and outputs, but can also use prebuilt generative AI prompts for common use cases including summarization, text generation and entity extraction. By presenting a curated list of common and suitable use cases, the prompt builder skill simplifies prompt generation, enabling users to start from a contextually relevant prompt and efficiently generate responses.
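The prebuilt-prompt pattern itself is straightforward to illustrate: a curated set of templates keyed by use case, which the user fills in rather than writing from scratch. The template text and use-case names here are hypothetical stand-ins, not Appian's actual prompts.

```python
# Illustrative sketch of a "prompt builder": curated templates for common
# use cases (summarization, entity extraction) filled in with user inputs.
PREBUILT_PROMPTS = {
    "summarization":     "Summarize the following text in {n} sentences:\n{text}",
    "entity_extraction": "List every {entity_type} mentioned in:\n{text}",
}

def build_prompt(use_case, **inputs):
    """Start from a contextually relevant prebuilt prompt and fill it in."""
    return PREBUILT_PROMPTS[use_case].format(**inputs)

prompt = build_prompt("summarization", n=2, text="Q1 revenue rose 8%...")
```

The value of the curated list is less the string substitution than the guidance: users pick a use case that is known to work well rather than guessing at prompt phrasing.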

Appian AI Copilot is said to be able to ease some of the most tedious development tasks by generating sample data. Users can simply specify the desired number of rows and let the AI copilot handle the rest, generating data for individual records and for complex sets of related records. Ideal for unit testing, user acceptance testing and stakeholder demos, AI Copilot accelerates the development lifecycle while ensuring the availability of realistic data for testing and demonstration purposes.
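The "specify a row count, get realistic records" idea can be sketched without any AI at all: a schema of per-field generators, plus a child table whose foreign keys reference the generated parents. This is a hypothetical illustration of what sample-data generation produces, not the AI Copilot's actual mechanism.

```python
# Sketch of sample-data generation for testing and demos: a requested
# number of rows per schema, with related child records keyed to parents.
import random

def generate_rows(n, schema, seed=0):
    rng = random.Random(seed)  # deterministic, so tests are repeatable
    return [{field: gen(rng, i) for field, gen in schema.items()}
            for i in range(n)]

customers = generate_rows(3, {
    "id":   lambda rng, i: i + 1,
    "name": lambda rng, i: rng.choice(["Acme", "Globex", "Initech"]),
})

# Complex sets of related records: each order references a real customer.
orders = generate_rows(5, {
    "id":          lambda rng, i: 100 + i,
    "customer_id": lambda rng, i: rng.choice(customers)["id"],
})
```

Keeping referential integrity between the generated tables is what makes such data usable for unit tests and stakeholder demos rather than just filler.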

“It also uses generative AI to enhance test case generation, addressing one of the most time-consuming tasks developers face by suggesting test cases aligned with users' business roles and ensuring comprehensive coverage and accurate execution of business logic,” notes the Appian team, as part of the firm’s annual product update statements.

Fewer software engineers?

Will all this mean we end up with more software developers and more apps, or fewer engineers and a smaller number of overall applications? The answer, always, is more software developers, more applications and more data services. It’s also certain that we’ll see an increasing number of software engineers using more AI-enriched optimizations and specifications.

The moves here see Appian push its platform toward a higher level of orchestration and advance its generative AI enhancements to create more meaningful process improvement and continuous optimization. It’s always a question of more developers, but now it’s also a question of more developers with a wider, bigger and sharper set of tools.

What might be most important here is the opportunity to consider Appian CEO Calkins’ world view on what we should be concerning ourselves with in terms of data privacy and security controls, so that we can take full advantage of process and code automation functions. As they say, with more automation comes greater responsibility.

The road to eliminating subjectivity

Low-code is still in its ascendancy and experiencing high times in terms of adoption, extension and possibly a fair degree of hype too. As we now start to encapsulate more software automations into composable, sometimes reusable, blocks, let’s also shoulder a commensurate amount of data control.

Appian CEO Calkins is famously outspoken enough to say that AI is not yet smart enough to make human decisions on its own; with the notion of a data fabric as a common semantic layer that makes all enterprise information resources feel local wherever they reside, it’s a question of the difference between data access and data control.

Looking immediately ahead, we need to orchestrate humans, machines, software bots and AI engines together through elastic process mining and process management so that, as Appian co-founder and chief technology officer Michael Beckley puts it, we take all the subjectivity out of decisions and base them on data and facts. There’s a process behind how we use data for AI, and that very process (and the microprocessor processing it drives) is rich in process intelligence.
