Unbuilding Frankenstein: designing publishing platforms for an automated, AI-driven future

“If I cannot inspire love, I will cause fear!” 

- Frankenstein, Mary Shelley, 1818

This month marks the 200th anniversary of Mary Shelley’s famous tale of Dr Frankenstein and his artificial ‘fiend.’

We know the plot. It’s a creation myth that warns us to ‘avoid ambition’ when it comes to building new versions of ourselves - the same lesson told by Terminator’s Skynet and countless similar tales over time.

Welcome to the new age of machine learning and artificial intelligence. When we consider AI and its promise, Frankenstein is the story today’s headlines keep reaching for. We’re being told that the robot fiends are coming and that the future may be bleak. It’s a powerful story that quickly concentrates our minds on a very specific set of outcomes.

But whilst these narratives raise genuine challenges and ethical considerations, they also cast a shadow over the real barriers to progress and the delivery of AI-driven solutions.

A 2017 article in VentureBeat sums up the current challenge nicely:

"The systems and tooling that helped usher in the current era of practical machine learning are ill-suited to power future generations of the intelligent applications they spawned."

In other words, if we want to innovate and deliver new services with new means of automation - and if we want to influence the broader AI playing field of tomorrow - then we need to change the way we design our platforms today. And that means book publishers and booksellers too.

Changing the platform

The most common recent development focus has been on extending the value of website-driven processes - such as content management, caching, UI and approaches to project management - to an increasingly diverse set of user requirements and endpoints. Whatever has worked for the creation and consumption of websites is preserved and extended for the creation and consumption of new environments.

Delivery mechanisms and the movement of data are more or less hardwired from data source (the back end) to application or service (front end), based on a clear understanding of the required user experience - a design principle which spawned a whole methodology that we know well today as ‘UX.’ Once the user experience is defined, data is extracted and integrated via a variety of tools to make it fit for presentation in front end environments.

This approach was perfect for the creation of websites (and mobile versions of those same sites), but the sun begins to set on it as our technology environment evolves. Processes, applications and tools that were designed for one mode of delivery get stretched to support new modes - and with each new operating requirement, the overall system becomes more complex and fragile.

Innovation makes stuff wobble

This can be illustrated via the following waves of innovation:

#1: Mobile

The arrival of mobile as the dominant device - along with a new set of presentation requirements - causes our platforms to creak a little. Mobile apps are usually hybrid - using data that’s part native, part web-based, and an application framework that tends to stray from the blueprint, requiring some code to sit outside of our master schemas. Productivity gains from standard platform components like Content Management Systems and Customer Relationship Management Systems are reduced as bits of their functionality get hived off or re-compiled to deliver the best possible user experience in the new environment.

For example: a new mobile application that enables readers to discover new content, books and articles will have to rely on a lot of shortcuts if the necessary data sources haven’t been built into the system to begin with.  

#2: IoT (Internet of Things)

With the introduction of new devices, such as home assistants, smartwatches and connected consoles (in-car, on-fridge, etc), the blueprint breaks down - it becomes all but impossible to deliver a website-driven process in a meaningful way to a variety of form factors that may not even have a screen.

For example: it’s going to be hard for a customer to order a new book or learn about that book using an Alexa or Google Home device if our core content and data aren’t easily repurposable for voice-driven, screenless interfaces.
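
To make that concrete, here’s a minimal sketch (in Python, with hypothetical field names loosely modelled on schema.org’s Book type - not any particular publisher’s schema) of what ‘repurposable’ content might look like: one structured record that can be rendered for a catalogue web page and for a screenless voice reply alike.

```python
# Illustrative sketch only: one structured record, many presentations.
# Field names are hypothetical, loosely modelled on schema.org's Book type.
book = {
    "title": "Frankenstein",
    "author": "Mary Shelley",
    "datePublished": "1818",
    "description": "A scientist assembles a creature and recoils from his own creation.",
    "offers": {"price": 7.99, "currency": "GBP"},
}

def render_web_card(b):
    """HTML-ish fragment for a catalogue website."""
    return f"<h2>{b['title']}</h2><p>{b['author']}, {b['datePublished']}</p>"

def render_voice_reply(b):
    """Short spoken answer for a screenless assistant."""
    return (f"{b['title']} by {b['author']} costs "
            f"{b['offers']['price']} {b['offers']['currency']}. "
            f"Shall I add it to your basket?")

print(render_web_card(book))
print(render_voice_reply(book))
```

The point isn’t the code itself - it’s that the data is modelled once, independently of any single endpoint, and each new device simply asks for a different rendering of it.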

#3: AI and automation

In the past 12-18 months, Machine Learning, Natural Language Processing, Graph Databases and a range of supporting tools and technologies - things that were once the preserve of academia or the big tech vendors - have made their way into the world and are now (thanks to a wealth of publicly available libraries and resources) readily available for primetime application development.

For example: we already have the tools to create solutions that are fully personalised to us and our content habits - enabling the type of service where our ‘next great read’ is suggested for us, by the machine, because it knows us so well.
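
As a rough illustration of how accessible this tooling has become, the sketch below uses scikit-learn (one of many freely available libraries) to suggest a ‘next read’ from plain-text descriptions. The catalogue and the reader’s history are invented for the example, and a production recommender would draw on far richer signals than a handful of keywords.

```python
# Minimal content-based recommendation sketch using scikit-learn.
# Titles, descriptions and the reader's last book are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

catalogue = {
    "Frankenstein": "gothic science creation ambition monster",
    "Dracula": "gothic horror vampire letters travel",
    "The Time Machine": "science fiction future travel invention",
    "Emma": "romance society matchmaking countryside",
}

titles = list(catalogue)
vectors = TfidfVectorizer().fit_transform(list(catalogue.values()))

def next_great_read(already_read: str) -> str:
    """Return the catalogue title most similar to the reader's last book."""
    idx = titles.index(already_read)
    scores = cosine_similarity(vectors[idx], vectors).ravel()
    scores[idx] = -1  # don't recommend the book they just finished
    return titles[scores.argmax()]

print(next_great_read("Frankenstein"))  # prints whichever title scores most similar
```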

Build for data, data by design

The trouble is, our current platforms can’t support the promise of this kind of automation, because the way they’re wired doesn’t support the cornerstone of AI: a smart data store where all of the tools and processes (graph technology, machine learning, natural language) that breathe intelligence and automation into our next generation services can be applied.

As mentioned, raw content and data currently tends to be extracted and integrated with a predefined set of endpoints in mind - for, say, a catalogue-based website.

This way of building things won’t support the promise of an AI-driven future because the new user experiences we are looking to design (natural language assistants, automated and personalised recommendation engines, etc) need to be defined and delivered via software rather than hand-cranked code.
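
As one hedged sketch of what that could mean in practice: the example below (using the open-source networkx library, with invented titles, subjects and relationships) models content as a single graph, so a website, a chatbot and a voice assistant can each ask their own questions of the same smart data store rather than each receiving a bespoke, hand-cranked extract.

```python
# Sketch of a graph-shaped content store; nodes and edges are invented examples.
import networkx as nx

g = nx.Graph()
g.add_edge("Frankenstein", "Mary Shelley", relation="written_by")
g.add_edge("Frankenstein", "Gothic fiction", relation="has_subject")
g.add_edge("Dracula", "Gothic fiction", relation="has_subject")
g.add_edge("Dracula", "Bram Stoker", relation="written_by")

def related_titles(title: str):
    """Titles sharing at least one subject with the given title - the kind of
    query a website, chatbot or voice skill could all reuse unchanged."""
    subjects = [n for n in g.neighbors(title)
                if g.edges[title, n]["relation"] == "has_subject"]
    return {other for s in subjects for other in g.neighbors(s) if other != title}

print(related_titles("Frankenstein"))  # {'Dracula'} for this toy graph
```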

When it comes to the delivery of new automated services, just like the shifts we’ve experienced in mobile, we’ve started to hit a limit on what we can do with our system components. Like Dr Frankenstein, we’re trying to build our new environments using a bunch of old body parts, often in the wrong order. (And we know this approach doesn’t end well!)

Chatbots may come and go, but our next generation platforms must be data-driven by design. All our development efforts ought to be geared to the creation and growth of our data assets, because they will be our single most valuable asset in the years ahead.

This is where all of our system intelligence will ultimately reside: ready for reuse across many different environments, formats, users and devices.