
Advances in computing such as artificial intelligence (AI) and machine learning have the power to change, and often improve, every corner of society, but one application some may be less aware of is in film and music.

[Image: Old photograph of The Beatles, half grainy and half restored]

From restoration projects to the very art of film writing and production itself, the rise of the algorithm continues to permeate all aspects of our lives.

Peter Jackson’s latest film project, the documentary The Beatles: Get Back, has the potential to be a watershed moment in the union between artificial intelligence, machine learning and the art of making films. In this blog, we look at the tech behind the project and investigate the growing world of artificial intelligence in film planning, production and marketing.

Get Back… in time

In May 1970, The Beatles released what would be their final film, Let It Be. It was a fly-on-the-wall documentary created during the writing and recording of the album of the same name, which would be released alongside it. It would mark their final studio release together (though not the last one they recorded – Abbey Road was recorded a few months later in 1969 and released before Let It Be).

During the filming of the documentary, which took place in Twickenham Film Studios and the Apple Records headquarters on Savile Row, around 60 hours of film footage and 150 hours of audio recording were captured. The original cut of Let It Be ran at a mere 80 minutes, meaning almost all of the available content remained unused and consequently unseen for over 50 years. That was until film director Peter Jackson’s project began in 2016 with the blessings of Paul McCartney, Ringo Starr and the widows of John Lennon and George Harrison.

Editing such a substantial amount of footage into three two-and-a-half-hour films would prove challenging enough given the disorganised nature of the material, but the biggest challenge by far was restoring the delicate, half-a-century-old audio and visual tapes into crystal-clear quality ready for modern viewing.

Peter Jackson explains:

“In 1969 that film had a quite chunky, grainy desaturated look to it. One of the purposes was to try to restore it, sort of making it look as natural as possible.”

The process yielded impressive results. The resulting video was not only cleaned of its grain, but also had bright, vibrant colours and much sharper imagery. Over 50 years of degradation was removed with a single, highly complex algorithm.

The first step of film restoration is digitisation, which converts an analogue tape into digital code. The algorithms used in restoration are instructions that computers execute over that code. The instructions specify that when certain patterns occur – for example, a pattern of code that describes a patch or scratch – they should be replaced with patterns that describe an absence of scratches. The algorithm additionally takes into account the imagery surrounding the scratch and mimics it, removing the imperfection.
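The idea described above can be sketched in a few lines. This is a deliberately simplified toy, not Jackson’s actual pipeline: given a mask marking scratch pixels, each flagged pixel is replaced with the median of its clean neighbours, mimicking the surrounding imagery.

```python
import numpy as np

def remove_scratches(frame, mask):
    """Replace pixels flagged as scratches (mask == True) with the
    median of their clean 3x3 neighbours, a crude stand-in for the
    pattern-mimicking step described in the text."""
    restored = frame.astype(float).copy()
    h, w = frame.shape
    for y, x in zip(*np.nonzero(mask)):
        # Clip the 3x3 neighbourhood to the frame edges.
        y0, y1 = max(y - 1, 0), min(y + 2, h)
        x0, x1 = max(x - 1, 0), min(x + 2, w)
        patch = frame[y0:y1, x0:x1]
        clean = patch[~mask[y0:y1, x0:x1]]  # keep only unflagged pixels
        if clean.size:
            restored[y, x] = np.median(clean)
    return restored

# A uniform grey 5x5 frame with a bright one-pixel "scratch" down column 2.
frame = np.full((5, 5), 100, dtype=np.uint8)
mask = np.zeros((5, 5), dtype=bool)
frame[:, 2] = 255
mask[:, 2] = True
restored = remove_scratches(frame, mask)
print(restored[:, 2])  # the scratch column is filled in from its surroundings
```

Real restoration systems detect the scratches automatically and use far more sophisticated models of the surrounding imagery, but the replace-from-context principle is the same.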

There is a huge amount of knowledge and refinement across fields such as neural networks, deep learning, approximation theory, compression, signal processing and more that is required to make this work. Some film-based algorithms can be enormously complicated, particularly the ones where frames are missing altogether. This was a particularly prevalent issue in Peter Jackson’s previous documentary release, the World War I feature They Shall Not Grow Old. During this project, his team restored 100-year-old footage shot on hand-crank cameras at somewhere between 13 and 17 frames per second – well below the modern standard of 24 frames per second. This accounts for the speed and jitteriness of old footage from that era when it’s played at a normal frame rate. In order to smooth it out and make it look more modern, the footage was put through a series of highly complex algorithms which ‘generated’ the missing frames based on the information in the surrounding frames. This enabled a steady 24 frames per second rate throughout.
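A minimal sketch of that idea: synthesise in-between frames by blending each pair of neighbours. The production systems estimate per-pixel motion rather than blending naively, but the principle of generating missing frames from the surrounding ones is the same.

```python
import numpy as np

def interpolate_frames(frames, factor=2):
    """Naive frame-rate upsampling: for each pair of neighbouring
    frames, insert (factor - 1) linearly blended in-between frames.
    Doubling 13 fps footage this way would yield 26 fps (hence the
    footage could then be resampled to a steady 24 fps)."""
    out = []
    for a, b in zip(frames[:-1], frames[1:]):
        out.append(a)
        for i in range(1, factor):
            t = i / factor
            # Cross-fade between the two existing frames.
            out.append(((1 - t) * a + t * b).astype(a.dtype))
    out.append(frames[-1])
    return out

# Two tiny 2x2 greyscale frames; one blended frame appears between them.
f0 = np.zeros((2, 2), dtype=np.uint8)
f1 = np.full((2, 2), 200, dtype=np.uint8)
frames = interpolate_frames([f0, f1], factor=2)
print(len(frames), frames[1][0, 0])  # 3 frames; the middle one is the blend
```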

[Image: Black and white photograph of WWI soldiers, restored and colourised]

These two projects mark the beginning of what could be a whole new era of machine learning-aided film restoration.

As ‘Moving Image Archive News’ reports:

“Developers of automated film-restoration are using big data to computationally analyse large data sets to reveal patterns, trends and associations. The patterns created by pixels captured in exposed film are one variety of data set.”

All of this is aimed at developing ‘one-click’ restoration in the future.

Within the Get Back project, the audio was equally, if not more, important to the presentation. The audio underwent a similar process to the video, as Jackson comments:

“To me the sound restoration is the most exciting thing. We made some huge breakthroughs in audio. We developed a machine learning system that we taught what a guitar sounds like, what a bass sounds like, what a voice sounds like. In fact, we taught the computer what John sounds like and what Paul sounds like. So, we can take these mono tracks and split up all the instruments so we can just hear the vocals, the guitars. You see Ringo thumping the drums in the background, but you don’t hear the drums at all. That allows us to remix it really cleanly.”

This process of de-mixing has yielded amazing results from an audio perspective. When the band was talking idly in the studio and strumming their guitars, the algorithm allowed the producers to strip the sound of the guitars away and focus on the voices to hear hidden conversations, something which was previously impossible from a mono source.
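A toy analogue of de-mixing shows why a mono source is not a dead end. Jackson’s team used machine learning; the sketch below cheats by separating two synthetic tones that occupy different frequency bands with a simple spectral mask, but the core move – splitting one combined signal into its sources – is the same.

```python
import numpy as np

# One second of audio at 8 kHz: a low "guitar-like" tone and a high
# "voice-like" tone summed into a single mono mix.
sr = 8000
t = np.arange(sr) / sr
guitar = np.sin(2 * np.pi * 110 * t)   # 110 Hz
voice = np.sin(2 * np.pi * 880 * t)    # 880 Hz
mono_mix = guitar + voice

# Move to the frequency domain and split the spectrum with a mask.
spectrum = np.fft.rfft(mono_mix)
freqs = np.fft.rfftfreq(len(mono_mix), 1 / sr)
mask = freqs < 440                      # everything below 440 Hz -> "guitar"
guitar_only = np.fft.irfft(spectrum * mask, n=len(mono_mix))
voice_only = np.fft.irfft(spectrum * ~mask, n=len(mono_mix))

# Each separated track recovers its source almost exactly.
print(np.max(np.abs(guitar_only - guitar)) < 1e-6)
```

Real instruments and voices overlap heavily in frequency, which is exactly why a fixed mask fails on real recordings and a learned model of “what a guitar sounds like” is needed instead.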

The results of Jackson’s team are truly phenomenal and provide a look at one of the most influential groups of musicians in history in a way that few thought was ever possible. Stream the documentary and see for yourself on Disney+ now.

This process was also utilised in the expanded and re-mixed version of The Beatles’ seminal 1966 album Revolver, released in October 2022. Giles Martin, son of original Beatles producer George, has stripped back the original master tapes to expose instruments and voices that were previously layered on top of one another on a single tape. This has allowed him to completely remix the album with a flexibility that was previously impossible, yielding incredibly impressive results. As he puts it:

“There’s no one who’s getting audio even close to what Peter Jackson’s guys can do. The simplest way I can explain it: It’s like you giving me a cake, and then me going back to you about an hour later with flour, eggs, sugar – and all the ingredients to that cake haven’t got any cake mix left on them.”

Furthermore, in November 2023, this same technique was used as a final chapter for the band with the release of Now and Then, the ‘final Beatles song.’ A 1977 piano and vocal demo by John Lennon was put through Jackson’s AI technology to strip out the original vocal and clean it up, with Paul, Ringo and other musicians recording modern instrumentation to fill out the song.

The AI film director

We’ve seen how artificial intelligence and machine learning are helping to make and restore films in a very practical sense, but what about the other applications for Hollywood? For the people who make money from movies, figuring out what will sell is, perhaps controversially in the world of film, of prime importance when deciding what to make.

Cinelytic, a Los Angeles-based startup, is just one of a multitude of companies trying to make artificial intelligence a key factor in film production. Fundamentally, it licenses historical film performance data and cross-references it with further information about each film’s themes, actors and more. It then uses machine learning to find trends in the data, allowing film producers to assemble their own dream formula backed by predictions of how the film will ultimately perform at the box office.
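In spirit, this is trend-fitting on historical data. The sketch below is purely illustrative – the features, figures and linear model are invented for this example and bear no relation to Cinelytic’s actual system – but it shows the shape of the task: learn from past films, then score a proposal.

```python
import numpy as np

# Hypothetical historical data: budget ($M), star-power score,
# genre-popularity score for four past films, with their revenue ($M).
history = np.array([
    [ 50, 0.8, 0.6],
    [100, 0.9, 0.7],
    [ 20, 0.3, 0.5],
    [150, 0.7, 0.9],
])
revenue = np.array([120, 300, 25, 400])

# Fit a linear trend with ordinary least squares (intercept included).
X = np.hstack([history, np.ones((len(history), 1))])
weights, *_ = np.linalg.lstsq(X, revenue, rcond=None)

# Score a proposed film before greenlighting it.
proposal = np.array([80, 0.85, 0.65, 1.0])
print(f"predicted revenue: ${proposal @ weights:.0f}M")
```

A production system would use far richer features (cast, genre, release window, territory) and non-linear models, but the workflow – historical data in, box-office estimate out – is the one these companies sell.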

Considering one leading man over another? Wondering whether your film’s themes will chime with a paying audience? Cinelytic is just one of the companies reportedly helping studios answer such questions. ScriptBook, a company based in Belgium, claims that its algorithms can predict a film’s success merely by analysing its script.

ScriptBook shared forecasts it made for movies released in 2017 and 2018 with The Verge, which suggests that its algorithms are doing a good job. In a sample of 50 films, including Hereditary, Ready Player One and A Quiet Place, just under half made a profit, giving the industry a 44% accuracy rate. ScriptBook’s algorithms, by comparison, correctly guessed whether a film would make money 86% of the time, roughly twice the accuracy rate that the industry achieved.

Another company, Vault, promises clients that it can predict which demographics will watch their films by tracking (among other things) how its trailers are received online. This has the potential to completely change the way films are marketed and released.

Clever as this technology is, issues arise from an artistic point of view. Do we really want films, and art in general, to be made based on sales predictions and humanity-free algorithms?

The beauty of Get Back is that you get to witness four human beings, all brilliant yet flawed, making brilliant yet flawed music in real time. Let It Be might not be their most critically well-received or commercially successful album (nowhere near, in fact), but it remains a fascinating historical document of a band having fun yet growing apart and starting to consider a life without each other.

The question this presents to us is: in the end, do we want to engineer a world where such cold algorithmic calculation would potentially lead to this album and film having never been made?

Embracing a technological future

We believe in the transformative power of data and technology and want to play our part in creating a smarter, safer, better-performing world, by connecting organisations and talent. We’re on a mission to bring the tech community together and run virtual and physical networking events to create meaningful connections across the tech space.

We aim to add value to the right people, in the right places. We place our clients and candidates first, priding ourselves on long-term, proactive relationships. We deliver on our promises; if we don’t know the answer, we’ll be honest and find someone who does.

As a business and a team, we practice what we preach. We give our people the same flexibility, strong culture, purpose, and positive working environment that we recommend our clients should offer in their organisations. This ensures we attract the best consultants, who thrive in an environment that nurtures their ambitions and who provide the best level of service to our candidates and clients alike.

We pride ourselves on being a clear, educated voice in an often crowded and competitive market. Get in touch with Andy Wadsworth (andy.wadsworth@morson.com) to find out how our insight and experience can help your tech organisation thrive in an ever-changing industry.

Search our latest jobs in tech and artificial intelligence here