In the bustling heart of Hollywood, the greenlighting process remains a testament to the industry's unique blend of art and business. Drawing from an array of considerations, such as the potency of the narrative, the allure of the attached talent, and prudent budgeting, studios often turn to "comps" or comparable films to guide their decisions. These comps are essentially references to past successes that bear thematic, stylistic, or market similarities to the project at hand. Take, for instance, the greenlighting of "The Hangover" based on the surprise successes of previous R-rated comedies. While the reliance on historical patterns and industry intuition has seen many a blockbuster born, the inherent unpredictability of audience reactions means that, functionally, greenlighting is an educated guess at best. Even with the most rigorous analyses, studios can never truly be certain of a film's reception.
But what if you could?
Silicon Valley, the tech industry's pulsating core, introduces a markedly different approach to predicting success: the concept of "Product-Market Fit." Here, startups and innovators float a minimum viable product (MVP) – a basic iteration of their vision, trimmed of frills. The MVP's primary purpose is to gather tangible, quantitative feedback from real-world users. Dropbox, for instance, initially launched a simple video showcasing its idea without building the entire infrastructure. The overwhelming interest that followed was clear evidence of a fit between the product and the market's needs. This methodology, anchored in real-time feedback and concrete data, allows companies to adjust, refine, or pivot based on concrete demand, minimizing the realm of guesswork.
Testing before you invest - a history of the “Lean Startup”
The Lean Startup methodology is a modern approach to business development that focuses on rapidly iterating product concepts based on real-world feedback, in order to reduce waste and increase the chances of creating a product that meets market demand. It was conceived by Eric Ries, who introduced the concept to the wider world in his groundbreaking book titled "The Lean Startup," published on September 13, 2011.
HISTORY TO REMEMBER. It’s notable that Lean Startup came out around the same time as the rise of Y Combinator and SAFE agreements - two other milestones that changed Silicon Valley and could be applied to Hollywood - read more about this here.
The methodology consists of a cyclical process that includes the following key steps:
Build: This step involves developing a "Minimum Viable Product" (MVP). An MVP is a version of the product that allows the team to start the learning process with the least amount of effort. It's not necessarily a smaller or cheaper version of the final product, but rather the simplest version that allows the startup to start the iterative testing and learning loops.
Measure: Once the MVP is launched, the next step is to measure its performance in the real world. This involves collecting data on how users interact with the product, what they like, what they dislike, and how well the product is meeting its intended objectives.
Learn: Based on the data collected, the startup then learns whether to persevere with the current course or to make a "pivot", a structured course correction to test a new hypothesis about the product, strategy, and engine of growth.
The methodology emphasizes the importance of adapting and adjusting before any large sums of money or time are invested. By following these steps and continuously cycling through the Build-Measure-Learn loop, startups can ensure that they're not wasting resources on building products that no one wants.
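The Build-Measure-Learn cycle described above can be sketched in code. This is a minimal, illustrative model only: the approval-rate feedback signal and the pivot/ship thresholds are hypothetical stand-ins I chose for the example, not part of Ries's framework.

```python
# Illustrative sketch of the Build-Measure-Learn loop. The MVP, the
# feedback signal, and the thresholds below are hypothetical choices
# for this example, not prescribed by the Lean Startup methodology.

PIVOT_THRESHOLD = 0.4   # assumed: pivot if under 40% approval
SHIP_THRESHOLD = 0.7    # assumed: product-market fit at 70%+ approval

def measure(audience_responses):
    """Measure: fraction of test users who approved this MVP."""
    return sum(1 for r in audience_responses if r) / len(audience_responses)

def learn(approval_rate):
    """Learn: decide whether to persevere, pivot, or ship."""
    if approval_rate >= SHIP_THRESHOLD:
        return "ship"
    if approval_rate < PIVOT_THRESHOLD:
        return "pivot"
    return "persevere"

def build_measure_learn(rounds_of_feedback):
    """Cycle through successive MVP iterations until one is worth scaling."""
    history = []
    for responses in rounds_of_feedback:
        rate = measure(responses)
        decision = learn(rate)
        history.append((round(rate, 2), decision))
        if decision == "ship":
            break
    return history

# Example: three successive MVP iterations, each tested on 5 users.
feedback = [
    [True, False, False, False, False],   # v1: 20% approval -> pivot
    [True, True, False, True, False],     # v2: 60% approval -> persevere
    [True, True, True, True, False],      # v3: 80% approval -> ship
]
print(build_measure_learn(feedback))
# [(0.2, 'pivot'), (0.6, 'persevere'), (0.8, 'ship')]
```

The point of the sketch is the shape of the loop: each pass is cheap, each decision is driven by measured reactions rather than instinct, and capital is only committed once the signal clears a bar.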
Lean Startup before it was called Lean Startup
Before the widespread adoption of the Lean Startup methodology, Powerset, founded in 2005 by yours truly (with Barney Pell and Lorenzo Thione), was already embodying its principles. From its inception, Powerset stood as one of the most ambitious artificial intelligence, machine learning, and natural language processing projects globally, comprising scientists and engineers from the Stanford Research Institute, Xerox PARC, and NASA. My company's core claim and value proposition was profound: we could beat Google, because a natural language understanding of the web was superior to a mere keyword-based approach.
To validate this hypothesis, we built an innovative prototype. This prototype juxtaposed Google's search results against Powerset's, all stripped of branding to ensure unbiased user feedback. Users would enter their search queries, compare the results from both engines side by side, and select which they preferred. After numerous searches, they could finally see which engine they favored more—Powerset or Google.
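A blind side-by-side test like this reduces to a simple statistical question: out of n votes, did significantly more than half prefer one engine? Below is a sketch of that analysis using an exact one-sided binomial test; the vote counts are hypothetical illustrations, not Powerset's actual data.

```python
from math import comb

def preference_test(prefer_a, prefer_b):
    """Blind side-by-side preference test, in the spirit of the
    Powerset-vs-Google prototype (the counts passed in below are
    hypothetical, not Powerset's real numbers).

    Returns the share preferring A and the one-sided binomial p-value
    for the null hypothesis that users have no preference (p = 0.5).
    """
    n = prefer_a + prefer_b
    share = prefer_a / n
    # Exact tail probability P(X >= prefer_a) under Binomial(n, 0.5)
    p_value = sum(comb(n, k) for k in range(prefer_a, n + 1)) / 2 ** n
    return share, p_value

# Hypothetical result: 70 of 100 blinded users preferred engine A.
share, p = preference_test(prefer_a=70, prefer_b=30)
print(f"share preferring A = {share:.2f}, p-value = {p:.2g}")
```

With a split like 70/30 the p-value is tiny, which is exactly the kind of concrete, quantitative evidence an investor can act on; a 52/48 split on a small sample, by contrast, proves nothing.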
Long story short, we kicked Google’s butt.
Peter Thiel (who also funded Facebook, which was across the street from us) tried the prototype firsthand, was immediately convinced of its potential, and promptly invested $800K in Powerset - our seed funding. Over the next two years, we continually iterated on our technology and our MVPs, and raised money at each stage of validation. We expanded the scope of our searches and increased our test user base, even leveraging Amazon's Mechanical Turk service (note: Jeff Bezos also invested) to gauge the preferences of hundreds of thousands of users.
Once we felt like we were ready, our initial product launch had a specific focus—it only searched Wikipedia. But even with this limited scope, its success was undeniable. Recognizing the transformative potential of our technology and the value it could bring to their own search ambitions, Microsoft acquired Powerset in 2008 for approximately $100 million.
This acquisition ultimately led to the birth of Microsoft Bing.
A THEME TO REMEMBER. Iterative building, testing, measuring, and validating are crucial for efficient capital deployment. This method ensures product development is data-driven, minimizing capital risks. When paired with funding milestones at each validation stage, it ensures resources are invested in promising endeavors, optimizing success chances in innovation-driven landscapes.
Learning from building in cycles.
Another fundamental principle underscoring product-market fit and the MVP strategy draws inspiration from an engineering approach to iterative development: start by constructing the smallest possible form of a complete product, then iterate, creating slightly larger and more complex versions of complete products until, many cycles later, you arrive at your “mostly” complete product. Hence the common turn of phrase: build a unicycle, evolve to a bicycle, transition to a tricycle, and only then, after iterative refinements, approach building a bus. This stepwise progression emphasizes the importance of evolving a product based on continuous feedback and understanding from the market. In contrast, the riskier "bet the farm" approach - seeing a pitch and then diving headfirst into building a bus without prior validation - can not only drain resources but also misjudge market desires.
In startups, you never build a space shuttle from scratch - you may only need a scooter.
The iterative model's superiority lies in its flexibility. With each version, from unicycle to bus, the product is shaped and reshaped by real-world feedback, ensuring that it aligns more closely with market needs and expectations. This reduces waste, both in terms of resources and time, and maximizes the chances of achieving genuine product-market fit. By evolving a product in iterations, companies can craft solutions that genuinely resonate with their audience, instead of risking vast resources on potentially misplaced assumptions.
Could this be applied to films and television shows?
In the past, the dynamic methodologies of product-market fit, MVP, and iterative development, which became pillars for the tech industry's triumphs, seemed elusive for the realms of film and television.
What is a unicycle for a movie? It’s the smallest possible product that can be tested with prospective customers - and that’s not a script, because a script can’t really be tested with audiences. It would be a trailer, or 10 minutes of a movie.
At the heart of this impasse is an undeniable challenge: exorbitant costs. Crafting an MVP for a movie or TV series, something beyond a script pass, to assess product-market fit is often deemed financially prohibitive. Unlike a digital software prototype, the intricate realm of filmmaking demands extensive crews, cutting-edge equipment, location scouting, meticulous post-production, and substantial capital, even for a rudimentary pilot episode or a condensed film. Given these towering demands, studios, armed with only a script and a few commitments from actors, invariably lean on instinct, comprehensive research, and historical benchmarks over actual audience feedback on a tangible prototype.
But times are changing.
The revolutionary convergence of artificial intelligence with the SMURF stack—comprising Screenplays, Modeling and Environment Building, Unreal Engine, Rendering, and Final Cut—has ushered in transformative possibilities for the entertainment industry. Picture a world where directors employ AI-powered and open-source software to craft a foundational 'unicycle MVP' of their envisioned film or television show - fast and cheap.
This MVP would manifest as a trailer, a brief film sequence, or an introductory episode of a series (something that could be tested with audiences), executed on a shoestring budget, perhaps a mere $50K, and within a condensed timeframe, such as 15 weeks.
Upon creating these MVPs, studios could roll them out to selective audiences or on specific platforms, deriving genuine, immediate feedback. Should these MVPs strike a chord, studios could engage in iterative refinements, evolving their creations. And then, fortified by tangible, data-driven insights, confidently greenlight expansive production ventures. This symbiotic fusion of the tech sector's iterative principles with Hollywood's rich narrative legacy could set the stage for an invigorated era of entertainment—minimizing financial uncertainties and generating content that resonates deeply with viewers.
SMURF STACK DEFINED. The SMURF stack represents a cutting-edge framework set to redefine the film and television industry by integrating technology and creative processes. Constituting the acronym "SMURF" are the following components: Screenplays (with AI-driven tools assisting in the crafting of intricate narratives and shot lists); Modeling and Environment Building (utilizing open-source platforms and asset libraries to create realistic models and immersive settings); Unreal Engine (pioneering the realm of virtual production to blur the lines between pre and post-production); Rendering (leveraging AI-enhanced tools for faster and more efficient visualization of scenes); and Final Cut (using advanced editing suites that incorporate AI to optimize the post-production narrative flow). Collectively, these elements are poised to democratize and revolutionize filmmaking. If you’d like to read about the SMURF stack in detail, feel free to read my “Got SMURF” issue from The Brief.
ACTIONABLE STEPS. So exactly how would you do this?
The Unicycle - a Still Image Trailer: Use AI to generate a detailed storyboard with blocked shots, dialogue, music, and foley. (2-3 weeks)
Utilize platforms like Midjourney to develop intricate concept art for characters, scenes, and environments to “seed” the art for consistency.
Employ Midjourney to create a storyboard: a detailed shot list with a vibrant still image for each shot, leveraging the "seed" from the initial concept art studies.
Create the auditory dimension of the trailer by synthesizing dialogue using Eleven Labs' AI voice generator to ensure accurate character tones and inflections.
Integrate music from trailer stem libraries to set the narrative and emotional tone. Adjust the chosen music to align with the pacing and beats of the visual scenes.
Enhance realism by integrating foley sounds, sourced from libraries like Boom, to add depth and authenticity to the trailer's environment.
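The steps above amount to assembling four asset layers (stills, dialogue, music, foley) for every shot. A sketch of how a small team might track that assembly is below; the file paths, field names, and the Shot structure itself are hypothetical placeholders I invented for illustration, standing in for files exported from the tools named above.

```python
from dataclasses import dataclass, field

@dataclass
class Shot:
    """One entry in the unicycle's shot list. The asset paths are
    hypothetical placeholders for exports from the tools discussed
    above (Midjourney stills, Eleven Labs dialogue, stem libraries,
    Boom foley) - this structure is not part of any of those tools."""
    number: int
    description: str
    still: str = ""        # storyboard still image
    dialogue: str = ""     # synthesized voice clip
    music: str = ""        # trailer-stem track
    foley: list = field(default_factory=list)  # foley cue files

    def missing_assets(self):
        """Name the layers of this shot that are still unfinished."""
        gaps = [name for name in ("still", "dialogue", "music")
                if not getattr(self, name)]
        if not self.foley:
            gaps.append("foley")
        return gaps

def trailer_ready(shots):
    """The unicycle is testable once every shot has every layer."""
    return all(not shot.missing_assets() for shot in shots)

shot1 = Shot(1, "Hero enters the ruined city",
             still="shots/001.png", dialogue="vo/001.mp3",
             music="stems/tension.wav", foley=["foley/footsteps.wav"])
shot2 = Shot(2, "Reveal of the antagonist", still="shots/002.png")
print(shot2.missing_assets())         # ['dialogue', 'music', 'foley']
print(trailer_ready([shot1, shot2]))  # False
```

The payoff of tracking completeness explicitly is that the unicycle only goes in front of test audiences when every shot carries every layer, so feedback reflects the concept rather than missing sound or picture.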
Once completed, this unicycle MVP can be showcased to writers and producers, providing them with a clearer understanding of the project's direction, tone, and style. Moreover, it can be presented to select test audiences, capturing their initial reactions and gathering insights about what resonates with them.
Armed with this feedback, the director is afforded the opportunity to refine and adjust the unicycle. By continuously iterating based on the collected feedback, the director ensures that the unicycle evolves into a more polished and compelling version, setting the foundation for the subsequent stages of production.
The Bicycle - a Moving Picture Trailer: Take the still images and turn them into a previz quality trailer using AI. (1 week)
Using the shot list images from Midjourney as inputs to RunwayML, generate short video clips for each scene.
Upon completing the bicycle stage, a director is now armed with a dynamic visual tool that not only captures the essence of the envisioned project but also serves as an enticing magnet for industry talent. This visual representation, richer in detail and narrative depth, can be instrumental in courting and captivating actors, showcasing the project's potential and providing them with a tangible glimpse of the film's atmosphere and tone. Similarly, it aids in drawing the attention of experienced editors, cinematographers, and other crucial team members. Their expertise and unique perspectives can elevate the project's quality and vision. In essence, the bicycle becomes a pivotal tool in assembling the dream team, laying the groundwork and gathering momentum for the forthcoming tricycle stage, where the project begins to take a more complete, collaborative, and cinematic shape.
The Tricycle - Insert Talent: Begin by integrating the cast and crew that have been acquired for the project. (3-5 weeks once talent is committed)
Replace Generic AI Actor Stills: In the initial unicycle, substitute the AI-generated stills of generic actors with AI-generated stills of the actors who have committed (and who have given written permission) to the project.
Collaborate with Cinematographers and Editors: Rework and refine scenes, blocking, and shot list based on inputs and expertise from newly recruited cinematographers and editors.
Create a New Unicycle: Utilizing the reworked storyboard and shot list, generate a revised set of still images.
Transition to a New Bicycle: Use the newly created storyboard stills as inputs in Runway ML to develop a moving picture representation of the project.
Integrate Authentic Actor Dialogue: Substitute the AI-generated dialogue with recordings of the actors who have committed, ensuring the dialogue resonates with their unique voice and intonation.
Audience Testing: Showcase the revised project to test audiences, gauging reactions and gathering feedback.
Iterate: Make necessary refinements based on feedback, iterating continuously until the tricycle phase reaches its desired completeness and quality.
By the culmination of the Tricycle phase, the director is not only equipped with a refined representation of the envisioned film but also fortified with audience data that hints at its potential success. This data serves as empirical evidence, a testament to the project's viability and its resonance with viewers. With such compelling insights in hand, the director is now in a robust position to approach financiers and investors, pitching not merely with passion but with proven promise. It's not just about seeking development funds anymore; it's about securing "Seed" funding, a term borrowed from the startup world. This Seed funding, much like in entrepreneurial ventures, is the crucial capital that allows a promising idea to blossom into a full-fledged production, backed by both artistic vision and market validation.
The Quad - SMURF: Utilize Unreal Engine and the Virtual Stage to build out a high fidelity previz trailer. (8-12 weeks)
Once Seed funding is secured, typically in the ballpark of $2 million, a director can transition from initial prototypes to the creation of high-fidelity trailers and scene shorts, leveraging the unparalleled capabilities of platforms like the Unreal Engine and harnessing the potential of virtual stages. However, while this influx of capital unlocks new possibilities, budgetary prudence remains paramount. Being "SMURF aware" is essential during this phase, guiding directors to strike a delicate balance between vision and viability.
This means making judicious compromises. Instead of investing heavily in building custom 3D models from the ground up, directors might lean on established resources like Kitbash3D, tapping into their vast array of pre-built models. Similarly, for populating scenes with crowds or background characters, rather than custom-designing each character, they can resort to prebuilt character libraries from companies such as Big Medium Small - see some examples of this below in the Go Down the Rabbit Hole section at the end.
This phase signifies a shift away from the early, low-fidelity visuals punctuated by the artifacts and uncanny qualities sometimes produced by Runway ML. The aim now is to immerse the project in the realm of high-fidelity CGI, achieving a level of realism and detail that captivates viewers and investors alike. The culmination of this meticulous work leads to a pivotal presentation, where the director showcases the refined vision, hoping to secure the much-coveted Green Light and embark on the journey of full-scale production.
Testing Product Market Fit.
The "Quad" stage signifies a transformative moment in the evolution of a film project. With a meticulously crafted, high-fidelity prototype available, studios are primed to undertake rigorous audience testing. By presenting the Quad to a diverse spectrum of demographics, studios can collect comprehensive feedback from different age brackets, cultures, and backgrounds. This initiative mirrors the tech industry's approach to Product Market Fit, allowing studios to gauge how various audience segments connect with the film's narrative, aesthetics, and thematic undertones. By meticulously analyzing this feedback, studios can determine if the film aligns harmoniously with the expectations and inclinations of its intended audience, effectively assessing its market viability. This rich pool of data not only suggests potential areas of refinement but also forecasts the project's prospective reception and success. Consequently, studios are better equipped to navigate their next steps, making informed decisions anchored in audience insights as they inch closer to greenlighting the final production.
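Segmenting Quad feedback by demographic can be as simple as averaging appeal scores per segment and flagging the segments that fall short. A sketch follows; the age brackets, the 1-10 scale, and the refine-before-greenlight threshold are illustrative assumptions, not an industry standard.

```python
from collections import defaultdict

def segment_scores(responses):
    """Average appeal scores by demographic segment.
    `responses` is a list of (segment, score) pairs; the segments
    and the 1-10 scale are illustrative choices for this sketch."""
    by_segment = defaultdict(list)
    for segment, score in responses:
        by_segment[segment].append(score)
    return {seg: sum(scores) / len(scores)
            for seg, scores in by_segment.items()}

def weak_segments(responses, threshold=6.0):
    """Flag segments whose average falls below the (hypothetical)
    refine-before-greenlight threshold."""
    return sorted(seg for seg, avg in segment_scores(responses).items()
                  if avg < threshold)

# Hypothetical Quad test screening results.
feedback = [("18-24", 8), ("18-24", 7), ("25-34", 9),
            ("35-49", 5), ("35-49", 4), ("50+", 6)]
print(segment_scores(feedback))
print(weak_segments(feedback))  # ['35-49']
```

A result like this tells the studio something a comp never could: the concept lands with younger viewers but needs refinement before it will travel to the 35-49 bracket.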
A THEME TO REMEMBER. Traditional “Comp” methods in film Greenlighting often result in a predictable and homogenized slate of content. In contrast, the Product Market Fit approach opens the door for testing a wider array of concepts, paving the way for innovative and potentially groundbreaking films to secure funding.
Big Benefits for Writers and Directors
Utilizing the "Product Market Fit" approach, especially when integrated with AI, offers a significant advantage over the traditional "Comp" (Comparable) method prevalent in the film industry. The inherent limitation of the Comp method is its reliance on past successes and known paradigms. By consistently drawing parallels with existing content, the Comp method inadvertently steers new projects towards familiar territories. This often results in a homogenization of content, with studios and filmmakers championing what is known to work, sometimes at the expense of novel, groundbreaking ideas.
In stark contrast, the Product Market Fit model champions exploration and innovation. It allows for any concept, no matter how avant-garde or unconventional, to be tested with real audiences. Filmmakers can assess genuine audience receptivity to fresh narratives and ideas, free from the constraints of past benchmarks. As a result, unique and pioneering movie ideas, which might have been sidelined in a traditional Comp-driven assessment, can be rigorously tested. By leveraging Product Market Fit, the film industry stands the chance of ushering in an era of richer diversity in storytelling, breaking free from the repetitiveness that often plagues mainstream cinema.
Big Benefits for Studios
Utilizing the Product Market Fit and iterative MVP methodologies could profoundly reshape the decision-making landscape for studios. By channeling funds in stages based on solid "proof" from MVP validations and real-time audience reactions, studios can sidestep the pitfalls of large financial gambles based on mere speculation. A notable example is the film "John Carter." With a budget soaring over $250 million and significant marketing costs, it became one of the industry's most notable box office disappointments. Had the studio employed this new approach, early audience feedback might have flagged concerns, prompting necessary adjustments or even reconsidering the project's viability altogether. This strategy not only mitigates risk but also opens doors to a wider array of innovative and diverse ideas without straining the budget.
A QUESTION. With every evolution in technology comes a question - can we maintain our essence? In this case AI can not only help us maintain our essence, it can help filmmakers prove that breakthrough ideas are worth funding.
As Hollywood navigates the complexities of integrating AI into its storied processes, it's essential to recognize that the advent of this technology isn't a harbinger of doom for the industry. In fact, AI presents a silver lining. By facilitating the creation of a "rough draft" of films, AI provides studios with an invaluable tool to determine the feasibility and potential success of a project before committing substantial resources. This judicious application of AI allows for the exploration of a broader spectrum of stories, showcasing our diverse human experiences more vividly. Beyond expanding our narrative horizons, this methodology aligns with optimizing financial strategies, ensuring that the cherished human touch, with its inherent costs, is reserved for stories that truly resonate with audiences.
Go Down the Rabbit Hole
Below is a collection of trailers and pitches crafted using AI, the SMURF stack, or a blend of both. Often, these are the works of VFX artists aspiring to venture into filmmaking, which sometimes reflects in a slightly underdeveloped narrative. Yet, I'm optimistic that in the near future, directors and writers will adopt these methodologies, collaborating with VFX artists to weave stories that are both captivating and impactful.
The EYE: Calenthek Trailer - Created in just six weeks with Unreal Engine 5, MetaHuman Creators, and a small team of artists.
Irradiation - a group of men enter the forest to find a strange and dangerous phenomenon.
The Making of Irradiation - this trailer was made using character kits from Big Medium Small.
Big Medium Small - instead of crafting custom characters from scratch, filmmakers can get prebuilt characters immediately - check out the kits for yourself.
Whispered Shadows - a theatrical trailer created completely with AI (Midjourney and Runway ML)
Genesis - a theatrical trailer created completely with AI (Midjourney and Runway ML)
Diablo Immortal Trailer - it’s still in the uncanny valley and it looks like a game - but it’s getting closer.
Good Hunting - an Unreal Engine short film created by Project Metamorph.art, which aims to provide UE-native AAA cinematic-quality characters for the filmmaking community.
Cloud Racer - Set in the near future, a young man and his blue-collar mechanic father compete in a qualifier for an advanced race craft competition, racing against elite competitors on a treacherous course that runs through the now “Ghost City” of Los Angeles.
Past Issues of The Brief
Got SMURF - The introduction of the SMURF stack, a group of free, open-source tools that will change the way films and television shows are funded and produced, opening a path to creating a Y Combinator for Hollywood.
Will AI Eat my Job? - A deep dive into how technological epochs have affected jobs and economic systems, plus a walkthrough of exactly which jobs will be affected in Hollywood and how job functions and know-how will change over the next 20 years.
Digital Sets - Historical comparison of the car industry juxtaposed against the history of set making in Hollywood - from live action on location, to virtual production, and finally a world where everything is digital. What this means for filmmakers, writers, actors, and producers.
Will Actors be Replaced by AI? - An addendum to Will AI Eat my Job, this issue delves into the impacts of artificial intelligence and CGI on actors: how AI and digital characters will be introduced into each genre of films and television shows, and what this means for established and aspiring actors.