Interviews: Tim Davis

Interview with Alejandro Cremades

An interview with Alejandro Cremades about raising $130 million to build a next-generation AI platform that simplifies AI development and deployment, after multiple startups and scaling AI at Google. You can read the full blog post here, and it's embedded below.

Multiple startups, scaling AI at Google, and now raised $130 million to build a next-generation AI platform

Tim Davis's entrepreneurial journey reflects a rare blend of intellectual curiosity, scrappy resilience, and a deep commitment to building relationships and networks. He has an exciting story, which includes an acquihire by Google and the launch of his latest venture, Modular.

Tim’s career trajectory started from gaming on a Commodore 64 and progressed to founding a food delivery startup before Uber Eats. His story isn’t just about pivots and products; it offers valuable lessons in adapting, grit, and navigating ecosystems on both sides of the Pacific.

In a riveting interview on the Dealmakers Podcast, Tim discussed hypergrowth companies, working at a tech giant like Google, and raising an impressive $130M for Modular.

Growing Up in Melbourne - The Early Years

Tim was born and raised in Melbourne, Australia, in the 1980s and 1990s. His mom was an artist, and his dad was a banker. Along with his older brother, Tim became an avid gamer, playing and honing his skills at Boulder Dash, Maniac Mansion, and Prince of Persia.

Soon, the brothers were modding many of the games, altering their files to change their behavior or appearance, or even introduce entirely new content. They would also find hacks for the games and program in BASIC, and eventually moved to Windows, the OS most users in Australia preferred.

Tim remembers moving quickly to Railroad Tycoon and Doom, which fueled his love for technology and computer systems. He also had a keen interest in puzzles and math throughout school. But the path from childhood gamer to Silicon Valley founder wasn’t linear.

Early Curiosity and a Winding Educational Road

Tim’s academic journey is one of the most eclectic you’ll find. He studied chemical engineering and microbiology, then added commerce, mathematics, an MBA, and even a JD to the mix. “I kept flipping a series of interests,” he says.

Tim recalls how he actually finished specific parts of the course early but lost interest in fluid mechanics, which made up a large part of chemical engineering. He progressed to actuarial science and finance, even exploring the possibility of becoming an investment banker.
But this seemingly scattered path wasn’t aimless; it was a quest for purpose. Despite internships in law firms and banks, Tim found himself disillusioned by the rigid, hierarchical growth trajectories in corporate Australia. 
“Even if you were a rockstar, you had to wait your turn. That felt strange.” The clarity Tim needed came from understanding what he didn’t want: to be stuck in a 9-to-5. “Life’s short,” he says. “While you're young, you’ve got time to explore different things. Why not take a shot?”

Now that he is older and has a family, time has become his scarcest resource. Tim comments wryly that you don’t have an appreciation for that when you're younger. He was keen on building an interesting business, reasoning that, worst case, he could always lean on his top-tier education.

As Tim sees it, growing up in Australia gave him the added advantage of a government-backed education, which didn’t saddle him with massive student debt, unlike in many other countries.

Entering the Startup Arena: Image Recognition and Fundraising Lessons

Tim’s first foray into startups came while studying patent and trademark law. “I was fascinated by how much effort went into innovation,” he says. Since he had studied computer science and technology, he was inspired to build a business around them. 

That fascination led to an image recognition startup focused on identifying branded content inside photos, an idea rooted in intellectual property logic. Tim began thinking about how brands could find their content inside images and monetize it directly.

But the Australian fundraising ecosystem at the time was harsh. “Angels would offer $100K for 20-30% of your company,” Tim explains, a model fundamentally incompatible with scaling. “We realized, if we really want to do this, we need to go to the US.”

Tim saw that they would eventually have to raise several more funding rounds for the company, CrowdSend, to become profitable. But under those terms, he would have given away a massive percentage of the company to someone who hadn’t contributed capital commensurate with the risk Tim was taking.

Moving to the US and Landing in Silicon Valley

In 2012, Tim boarded a one-way flight to Silicon Valley, landing in a hacker house he had found on Airbnb, which was run by a YC founder whose startup had failed. It turned out to be a blessing. The house, a de facto college dorm for ambitious misfits, became the spark for his second startup.

At the time, Tim didn’t know much about Silicon Valley, but interacting with the talented folk at the hacker house, he discovered what an incredible place it was. It had lots of entrepreneurs from around the world. 

Arriving in the US, Tim decided against pursuing the image recognition business and instead started a new company with a co-founder he had met, Francisco Magdaleno.

The Early Hustle, Fluc Inc: A Food Delivery Idea Before Uber Eats

What started as a scrappy food delivery idea among housemates quickly turned into a fully functioning business. Tim built the front end and back end, and his co-founder developed the iOS app. 

Before they knew it, they were pioneering one of the first food delivery apps just as DoorDash was launching in stealth mode as “Palo Alto Delivery.” 

But signing up restaurants was painful. “They laughed at us,” Tim remembers, but he was very confident that their prototype was working well. 

Inspired by Grubhub’s financial statements and business model, which primarily focused on sales and marketing, they added restaurants without permission and increased menu prices by 10% to 20%. The hack worked. Consumers wanted selection. That was the unlock.

The two co-founders continued coding remotely while sorting out their visa situation. They also brought in a third co-founder from the US, Adam Ahmad. When they launched Fluc, it gained popularity quickly, since no other service in the market added every restaurant.

The service exploded across Stanford, Palo Alto, and Mountain View. By 2014, Tim and Francisco were doing millions in top-line revenue. But the margins were brutal. “Food delivery is a horrendous margin business,” Tim admits.

Back in 2014-2015, the environment was particularly challenging for the on-demand economy. Legal uncertainties about whether drivers were contractors or employees spooked investors, making further fundraising difficult.

Google Steps In: A New Chapter

Amid rising legal complexity and capital challenges, Google entered the picture. The company wasn’t interested in the food delivery business, but they were very interested in the team. Google conducted interviews with the startup’s team members. 

“It wasn’t a formal acquihire,” Tim notes. “They just wanted the people they thought were talented.” Tim and a few others joined Google, while some team members were not selected. What followed was a seven-year stint at one of the most prestigious technology companies.

At the time, Google was scaling a business called Google Express across North America. The business was not dissimilar to Instacart, working with merchants to essentially scale last-mile delivery.

The Google Years: Culture Shock and Product Execution

Going from startup life to Google was a seismic shift, and Tim picked up important lessons about how large organizations work, and how products get built and assembled at scale.

Completing design reviews, product reviews, and engineering reviews, while learning how to create a strong product, was also part of his experience. 

Tim also learned organizational discipline and product execution, gaining valuable exposure to some of the world’s most brilliant minds. The melting pot of diversity and talent density impressed and inspired him.

“In startups, we worked from 7 a.m. to midnight. My first day at Google, people left at 4:30 p.m.,” Tim recalls. While the relaxed pace was jarring, Tim soaked up the best aspects of big tech. After a year in Ads, he moved to Google Brain, the elite AI research unit, which is now Google DeepMind.

It was 2017, before the AI boom, but Tim was hooked. “Being around the world’s best in AI was something you just couldn’t get anywhere else.” Although he had been exposed to recommendation systems inside ads in his logistics startup, deep learning was a new paradigm.

Here, Tim met his future co-founder, Chris Lattner, the creator of the legendary Swift programming language. Tens of millions of developers use this language today, and it drives most of the iOS ecosystem.

The Power of Hyper Networks

Tim emphasizes that the mentorship network he built at Google, particularly at Brain, became one of his most valuable assets. “That core group at Brain, many are now leading the next wave of AI startups like Character.ai, Adept, and others.”

What makes a hyper network? According to Tim, it’s not just about connections, but shared experience. “You work on hard problems with talented people. That trust and track record becomes the foundation for your next venture.”

The Next Chapter: Modular and AI’s Supercycle

Even while at Google, from 2016 to 2018, Tim could see that AI was going to have a massive impact on the world. Much of Google's internal technology was years ahead of the rest of the world, and he could see the possibilities.

Examining the product landscape, Tim noted that NVIDIA owned most of the compute that powers the world’s AI, even in its early stages in 2018-2019. At the time, Google had built its own AI accelerator hardware, the TPU.

Tim had a background in product development, marketing, design, and sales, while Chris is a world-renowned engineer. “We asked ourselves—what if there was an open, universal abstraction layer for AI workloads?” Tim explains. 

The vision was a platform where developers could define their model, budget, and latency needs without caring about what hardware ran it, essentially making AI compute truly portable and efficient.
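As a thought experiment, that kind of declarative deployment can be sketched as a toy scheduler: the developer states requirements, and the platform picks hardware. Everything below (the spec fields, device catalog, prices, and selection rule) is invented for illustration and is not Modular's actual API.

```python
from dataclasses import dataclass

# Hypothetical deployment spec: the developer declares *what* they need,
# not *which* hardware runs it.
@dataclass
class DeploymentSpec:
    model: str
    max_latency_ms: float
    max_cost_per_hour: float

# Toy catalog of available hardware (made-up numbers).
DEVICES = [
    {"name": "cpu-large", "latency_ms": 120.0, "cost_per_hour": 0.40},
    {"name": "gpu-a",     "latency_ms": 15.0,  "cost_per_hour": 2.10},
    {"name": "gpu-b",     "latency_ms": 9.0,   "cost_per_hour": 3.90},
]

def place(spec: DeploymentSpec) -> str:
    """Pick the cheapest device that satisfies both the latency and cost budget."""
    candidates = [
        d for d in DEVICES
        if d["latency_ms"] <= spec.max_latency_ms
        and d["cost_per_hour"] <= spec.max_cost_per_hour
    ]
    if not candidates:
        raise ValueError("no device satisfies the spec")
    return min(candidates, key=lambda d: d["cost_per_hour"])["name"]

spec = DeploymentSpec(model="resnet50", max_latency_ms=50.0, max_cost_per_hour=3.0)
print(place(spec))  # -> "gpu-a": the only device meeting both constraints cheaply
```

The point of the sketch is the inversion of responsibility: the workload stays portable because hardware selection happens behind the abstraction layer, not in the developer's code.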

This idea formed the foundation of Modular. It wasn’t a small bet; it was a deep-tech infrastructure play that would take years to build. But the potential to decentralize AI hardware dependency and optimize performance across environments made it one worth pursuing.

Business Model: Scaling with Compute

Modular’s revenue model is tightly tied to usage. “We scale with the amount of compute that flows through our platform,” Tim says. Similar to how Databricks charges based on compute units, Modular’s customers pay based on the volume of AI workloads run.

For enterprises with on-premise deployments, the model shifts to a per-GPU pricing structure. Modular integrates seamlessly with environments like Kubernetes, allowing large-scale AI training or inference across private infrastructure.
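The two pricing modes described above can be illustrated with back-of-the-envelope arithmetic. The rates and volumes below are invented for illustration only; the article does not disclose Modular's actual prices.

```python
def usage_bill(compute_units: float, rate_per_unit: float) -> float:
    """Usage-based billing: pay for the compute that flows through the platform."""
    return compute_units * rate_per_unit

def on_prem_bill(num_gpus: int, rate_per_gpu_month: float, months: int) -> float:
    """On-premise billing: a flat per-GPU rate, independent of utilization."""
    return num_gpus * rate_per_gpu_month * months

# Hypothetical numbers: 50,000 compute units at $0.02/unit ...
print(usage_bill(50_000, 0.02))   # 1000.0
# ... versus an 8-GPU on-prem cluster at $500/GPU/month for 3 months.
print(on_prem_bill(8, 500, 3))    # 12000
```

The usage model scales bills with workload volume, while per-GPU pricing gives enterprises a predictable cost for fixed private infrastructure.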

Additionally, Modular has embraced cloud partnerships as a key growth channel. Working with providers to embed Modular into their offerings allows the company to monetize through distribution partnerships.

Tim likens this approach to the early Microsoft-Intel alliance or Databricks' partnership with Azure. As he sees it, channel partnerships are an excellent way to get strong distribution, and an exciting channel that Modular has also been utilizing.

A Different Approach to Fundraising

With over $130M raised, including backing from Google, Tim has gained a new perspective on how to raise capital effectively. His key takeaway? Skip the pitch deck and start with a memo explaining why the opportunity is significant for the world.

Storytelling is something Tim Davis has mastered. For founders who do use decks, the key is capturing the essence of what you are doing in 15 to 20 slides. For a winning deck, take a look at the pitch deck template created by Silicon Valley legend Peter Thiel (see it here: https://startupfundraising.com/pitch-guide), where the most critical slides are highlighted.

“In our seed round, we didn’t use slides,” Tim explains. “We wrote a three-to-four-page memo explaining what made this opportunity and us as founders uniquely compelling.” The Amazon-style narrative was met with enthusiasm from investors, who appreciated the clarity and depth.

This written approach also enabled more meaningful conversations. “Instead of starting from scratch in meetings, investors came in prepared with thoughtful questions. We’d go straight to whiteboarding,” Tim says. 

The contrast to his first fundraising experience, where pitch decks led to polite but empty rejections, was stark. The memo strategy wasn’t a one-off. Tim and Chris used it again for Modular’s $100M round.

They supplemented it with detailed papers on AI trends, compute economics, and developer growth. The result? Describing things in written form led to deeper discussions and faster alignment with the right investors.

Writing as a Cultural Backbone

Tim strongly encourages people to write their plans in a two-page memo, describing everything they think they can do, why they are uniquely positioned to do it, and why someone should give them the capital over all the competitors.

A two-page memo is more visionary and proves that the project deserves backing. It pushes founders to think deeply, since they have only two pages on which to raise capital. It also helps them gain clarity on what they are building.

Modular’s documentation-first philosophy isn’t just for fundraising. Tim has embedded it into the company’s operating cadence. “We ask everyone to write down decisions using a simple problem-solving framework,” he shares.

The framework is composed of five core questions:

  1. What problem are we solving?

  2. Why is now the right time?

  3. What does success look like?

  4. What alternatives have been considered?

  5. What’s the recommended course of action?

“It’s amazing how this clarity either makes a path obvious or sparks a productive debate,” Tim explains. Whether it’s a hiring decision, a product strategy, or a sales play, the same structure applies. This culture of structured thinking has become central to Modular’s execution engine.

Tim says that he just wants to see a document that briefly outlines this simple structure. It’s a succinct framework to help drive organizational and product decision-making across companies.

On Mentorship, Networks, and Helping Others

Tim is deeply aware of how relationships have shaped his journey from crashing at a hacker house in his early days to meeting his Modular co-founder at Google Brain. Now, he tries to pay that forward.

“I came to the U.S. from Australia not knowing anyone,” Tim says. “People bet on me, and everything meaningful in Silicon Valley is about how you treat people and the relationships you build.” For founders navigating similar transitions, Tim is generous with his time. 

“If I can help someone get a leg up, I try to. I know how hard it is to land in a new country and try to build something important.”

Final Reflections

Tim Davis’ story is one of navigation across continents, careers, and paradigms. From early missteps in chemical engineering to startup chaos and enterprise calm at Google, he’s assembled a unique blend of technical fluency, legal insight, and network leverage. 

What ties it all together is a founder’s mindset: curious, bold, and constantly evolving. As Tim puts it, “You can’t always plan the path, but you can keep showing up, keep building, and make sure you’re surrounded by the best.”

After years at Google and a successful startup exit, Tim Davis wasn’t just interested in launching another venture; he aimed to fundamentally reshape how AI workloads are deployed and scaled.

Together with Chris Lattner, the legendary engineer behind the Swift programming language, Tim co-founded Modular, a company building what they see as a missing layer in the AI ecosystem: a hardware-agnostic infrastructure for machine learning workloads.

Listen to the full podcast episode (https://alejandrocremades.com/tim-davis/) to learn more, including:

  • Tim Davis's unconventional education and early passion for gaming laid the foundation for a bold entrepreneurial path across industries and continents.

  • His first startup experience taught him the hard lessons of equity, risk, and the limitations of the Australian fundraising ecosystem.

  • Moving to Silicon Valley transformed his trajectory, exposing him to global talent, scrappy startup culture, and ultimately Google’s scale.

  • Google Brain became a pivotal experience, where Tim gained deep exposure to AI and formed critical relationships that fueled his next venture.

  • With Modular, Tim is building a hardware-agnostic platform for AI compute, aiming to decentralize and optimize AI infrastructure.

  • His fundraising strategy, centered on narrative memos instead of pitch decks, has helped raise over $130M and foster deeper investor alignment.

  • Tim’s focus on structured thinking, writing culture, and mentorship reflects his belief in clarity, networks, and paying it forward.


Fund/Build/Scale Podcast Interview

Leaving a high-paying role at Google to take on NVIDIA, Intel, and AMD is not for the faint of heart, but that’s exactly what Tim Davis, co-founder and president of Modular, did.

I had a wonderful conversation with Walter Thompson about Modular some time ago; we covered AI compute, building a new AI software stack, the ups and downs of startups, and scaling AI into the future.

Introduction


In this episode of Fund/Build/Scale, Tim explains why Modular has raised $130M to reimagine AI compute infrastructure, and what he’s learned trying to build a platform that competes with some of the biggest names in tech.

We talked about:

  • 🚀 Why Modular believes AI workloads need a hardware-agnostic execution platform

  • 💡 How Tim and co-founder Chris Lattner decided to “start from the hardest part of the stack”

  • 💰 The trade-offs of raising VC for infrastructure-heavy startups

  • 🛠 Why Modular focuses on talent density and how they’ve recruited top engineers from Google, NVIDIA, and beyond

  • 🌍 What it takes to break into the AI space when your competitors are trillion-dollar companies

Tim takes a thoughtful, deep-dive approach to this conversation—unpacking the complexities of AI infrastructure and what it takes to build in one of tech’s most competitive spaces. There’s valuable insight here for founders navigating technical markets or aiming to disrupt entrenched players.

Episode Breakdown

(1:26) “We are building a new accelerated execution platform for compute.”

(6:41) “It will exist all over the place and it already does, but AI will be everywhere that compute is.”

(11:18) “You only have so much time in a week. What is the thing that you're best at?”

(15:13) “We have decided to start from the hardest part of the software stack.”

(22:44) “For the most talented people in the world, the risk is actually not as great as what you think.”

(30:24) “Growing up in Australia, my view of the United States was very much driven from the media and from Hollywood.”

(33:26) “I sat in a room for six weeks and just met everyone that I could. And that really was the beginning of a journey to the United States.”

(37:48) “I still think there's a special place in the Bay Area, and in the United States, there is a different risk appetite.”

(40:41) The one question Tim would have to ask the CEO before he’d take a job at someone else’s early-stage startup.


The Aussie conquering Silicon Valley

Meet the little-known young Australian in Silicon Valley at the forefront of the artificial intelligence revolution, who heads one of the hottest start-ups in America that has the audacious goal of fixing AI infrastructure for the world’s software and hardware developers.

By JOHN STENSHOLT (The Australian Business Review)

After years of toughing it out and watching his Silicon Valley dream almost die, a little-known former Melbourne NAB analyst now leads one of the hottest start-ups in America.


Tim Davis, 40, turned up in California on a whim a little over a decade ago, survived a stint in a wild house known as the “Hacker Fortress”, started an early version of an online food delivery service before the rise of UberEats and others, was poached by Google only to leave the technology giant in 2022 to co-found Modular, an AI infrastructure start-up recently valued at about US$600 million (A$927 million).

And he says he is only getting started.

In his first Australian media interview, Davis says the goals for Modular are clear – and big.
“We started Modular to improve AI infrastructure for the world. Changing the world is never easy – but we are incredibly determined to do so,” he said.

“AI is so important to the future of humanity, and we feel a great purpose to try to improve AI’s usability, scalability, portability and accessibility for developers and enterprises around the world.”

Modular’s vision is to allow AI technology to be used by anyone, anywhere, and it is creating a developer infrastructure platform that enables more developers around the world to deploy AI faster, and across more hardware.

Davis, and his American co-founder Chris Lattner, have helped build and scale much of the AI software infrastructure that powers workloads at some of the world’s largest tech companies – including Google – but they argue this software has many shortcomings and was designed for research and not for scaling AI across the vast number of new uses and hardware the world is demanding now and into the future.

They are aiming to rebuild and unify AI software infrastructure, solving fragmentation issues that make it difficult for developers who work outside the world’s largest companies to build, deploy and scale AI, and ultimately make AI more accessible to everyone.

“We thought we could build something unique that could actually empower the world to move faster with AI, while equally making it more accessible to developers, make it easier to program in and better from a cost standpoint because you can scale it to different types of hardware.”

Modular claims to have built the world’s fastest AI inference engine – software that enables AI programs to run and scale to millions of people – and its own programming language, Mojo, a superset of Python (the world’s most popular programming language) which enables developers to run their AI programs tens of thousands of times faster, reduce costs, and deploy AI around the world more simply.
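Because Mojo is designed as a superset of Python, code written in the Python subset is intended to run unchanged, with Mojo's optional typed, systems-level constructs layered on top for speed. The snippet below is plain Python of the kind such a numeric kernel starts from (a naive matrix multiply, the classic example in Mojo demos); it is illustrative only, not Modular benchmark code.

```python
def matmul(a, b):
    """Naive matrix multiply: the kind of Python loop nest Mojo aims to accelerate."""
    n, k, m = len(a), len(b), len(b[0])
    out = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            s = 0.0
            for p in range(k):
                s += a[i][p] * b[p][j]
            out[i][j] = s
    return out

print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19.0, 22.0], [43.0, 50.0]]
```

The pitch is that a developer keeps this familiar syntax while opting into Mojo's lower-level features where performance matters, rather than rewriting the kernel in C++ or CUDA.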

After raising US$30 million from investors last year, Modular recently raised another US$100 million in a funding round led by private equity firm General Catalyst and including Google Ventures, and Silicon Valley-based venture funds SV Angel, Greylock Partners and Factory.

Modular says more than 120,000 developers are now using its products – such as its inference engine and Mojo, launched in early May – and that “leading tech companies” are already using its infrastructure, with 35,000 more on the waitlist.

The US$100 million raised will be used on product expansion, hardware support and the expansion of Mojo, as well as building the sales and commercialisation aspects of the business.

Early years

All of which is a far cry from when Davis flew to Silicon Valley in September 2012 with dreams of becoming an entrepreneur, following stints working as a financial analyst at National Australia Bank and then several internships at law firms like Allens and Minters after completing law, business and commerce degrees at Melbourne’s Monash University.

Davis had started his own company called CrowdSend in 2011, which had a software system for identifying objects in images and pictures and matching them to retailers, but said he found it “difficult” as “investors in Australia, if you wanted to raise some seed capital, they would take a very large percentage of the business.”

“So it was my wife that said if you want to do this (become an entrepreneur) why don’t you go to the place where the technology is. We’d actually just gotten married and then off I went to America.”

Davis used Airbnb to find a place to stay in Silicon Valley, gave himself six weeks to make a success of things and found what was called the Hacker Fortress in the Los Altos Hills.

“It was very much a place with a whole bunch of misfits who had come to America, particularly Silicon Valley, to do a start-up. This was a very big house, it had 15 rooms, and my dream at the time was how to make CrowdSend successful. There were a lot of really talented people there,” he says.

“(But) it turns out when you come to Silicon Valley, and you come with a preconceived notion of what you want to do, well then there’s reality of what you ended up doing.”

There were plenty of issues in the Hacker Fortress, which was not as clean as advertised on Airbnb. At one stage, a housemate died in his room. One evening, the roof of Davis’s own room fell in.

The house, despite being advertised as being only 10 minutes from the likes of Google and Apple, was actually quite a way from shops and restaurants.

“We basically had no ability to get food easily and so we had this idea that maybe we could build this large scalable distribution model,” Davis explains. “At the time the likes of GrubHub and these other businesses were going to restaurants and trying to get commission deals. So instead we reverse engineered the Starbucks and Chipotle menus and built an app and threw it out as an idea to people in the house.”

Before UberEats

The idea for what would become Davis’s next business, Fluc (Food Lovers United Co), was born – all because he and his housemates were a little lazy and really didn’t want to cook.

It was 2013, a year before UberEats was launched, and while other food delivery service apps were dealing directly with restaurants, Davis and his team took a different approach.

“We just thought, why don’t we just grab every restaurant menu, we just put it on our website, inflate the prices, and start selling food? Overnight, we went from five restaurants to 160 restaurants. And it just exploded. It was unprecedented. And what we tapped into really was that selection mattered. And that’s what consumers had not had, at least in the American market. And so then we went on this journey of raising capital.”

Fluc would mainly service the local Bay Area market of about 8 million people, though it tried to move to Los Angeles at one stage, and raised US$4 million from local angel investors.

Davis says after two and a half years of working seven days a week it became obvious “that we wouldn’t be able to raise the amount of capital that we needed to scale that business” at a time when the likes of DoorDash were raising hundreds of millions of dollars annually from investors.

Davis and his team then started talking to other companies about potentially being acquired, and eventually Google expressed interest not in the company but the people Fluc had assembled.

That led to Davis joining Google, where he worked as a product manager: first in the ads division, where he built machine learning systems, and then in the Google Brain research division dedicated to AI, where he met Lattner.

What followed, Davis says, was six years of building AI systems that stood them in good stead when they decided to leave Google and form Modular.

“We were basically there for all the major components of AI as it is now … and what was interesting is that through that experience we could see the infrastructure was designed by researchers – a bunch of people who wanted to train large machine learning models. But taking those models from a research environment and actually scaling them into very large production systems is very hard.”

He says Modular’s only goal is to find solutions to that problem and commercialise them, and that the opportunity “we have in this market is astronomically huge.”

What’s next

“We work with everyone from high performance racing teams, to autonomous car companies, to very large machine learning recommendations to generative AI. You name it, whether it’s video generation, image generation, text generation, we’re there.

“You could go down the NASDAQ list of companies (to find those) who want to use our infrastructure to scale AI inside their organisations.”

As for widespread concerns about AI, Davis says the fact that AI systems take an “incredible amount of time to build” means that the public should be “realistic” about the perception that AI could be a threat to humanity.

“We definitely have better recommendation systems, we have better chat bots, we have translation across multiple languages, we can take better photos, we are now all better copywriters, and we have voice assistants.

“But none of this is remotely close to AGI or artificial general intelligence – or anywhere near ASI: artificial super intelligence. We have a long way to go.

“AI shouldn’t be seen as a replacement for human intelligence, it’s a way to augment human life – a way to improve our world, to leave it better than we found it.”


Unite AI – Interview Series

Tim Davis is the Co-Founder & President of Modular, an integrated, composable suite of tools that simplifies your AI infrastructure so your team can develop, deploy, and innovate faster. Modular is best known for developing Mojo, a new programming language that bridges the gap between research and production by combining the best of Python with systems and metaprogramming.

By Antoine Tardif (Unite AI)


A repeat entrepreneur and product leader, Tim helped build, found, and scale large parts of Google's AI infrastructure at Google Brain and Core Systems, spanning APIs (TensorFlow), compilers (XLA & MLIR), runtimes for server (CPU/GPU/TPU) and TF Lite (mobile/micro/web), Android ML & NNAPI, and large-model infrastructure and OSS for billions of users and devices. He loves running, and building and scaling products to help people and the world.

When did you initially discover coding, and what attracted you to it?

As a kid growing up in Australia, my dad brought home a Commodore 64C and gaming was what got me hooked – Boulder Dash, Maniac Mansion, Double Dragon – what a time to be alive. That computer introduced me to BASIC and hacking around with that was my first real introduction to programming. Things got more intense through High School and University where I used more traditional static languages for engineering courses, and over time I even dabbled all the way up to Javascript and VBA, before settling on Python for the vast majority of programming as the language of data science and AI. I wrote a bunch of code in my earlier startups but these days, of course, I utilize Mojo and the toolchain we have created around it.

For over 5 years you worked at Google as Senior Product Manager and Group Product Leader, where you helped to scale large parts of Google's AI infrastructure at Google Brain. What did you learn from this experience?

People are what build world-changing technologies and products – it is a devoted group of people, bound by a larger vision, that brings them to the world. Google is an incredible company with amazing people, and I was fortunate to meet and work with many of the brightest minds in AI when I moved to join the Brain team years ago. The greatest lessons I learnt were to always focus on the user and progressively disclose complexity; to empower users to tell their unique stories to the world, like helping fix the Great Barrier Reef or helping people like Jason the Drummer; and to attract and assemble a diverse mix of people to drive towards a common goal. In a massive company of very smart and talented people, this is much harder than you can imagine. Reflecting on my time there, it’s always the people you worked with that are truly memorable. Many people took risks on me, and I’m enormously thankful they did – those risks encouraged me to be a better leader and person, and to dive deep and truly understand AI systems. That experience made me realize the profound power AI has to impact the world, and it was the very reason I had the inspiration and courage to leave and co-found Modular.

Can you share the genesis story behind Modular?

Chris and I met at Google, where we shipped many influential technologies that have significantly impacted the world of AI today. However, we felt AI was being held back by the overly complex and fragmented infrastructure we witnessed firsthand while deploying large workloads to billions of users. We were motivated by a desire to accelerate the impact of AI on the world by lifting the industry towards production-quality AI software, so that we, as a global society, can have a greater impact on how we live. One can’t help but wonder, by increasing the penetration of this incredible technology, how many problems AI could help solve, how many illnesses could be cured, and how much more productive we could become as a species, furthering our existence for future generations.

Having worked together for years on large-scale, critical AI infrastructure, we saw the enormous developer pain firsthand – “why can’t things just work?” For the world to adopt and discover the enormous transformative nature of AI, we need software and developer infrastructure that scales from research to production and is highly accessible. This will enable us to unlock the next wave of scientific discoveries – for which AI will be critical – and it is a grand engineering challenge. With this motivating background, we developed an intrinsic belief that we could build a new approach to AI infrastructure and empower developers everywhere to use AI to help make the world a better place. We are also very fortunate to have had many people join us on this journey, and we have the world's best AI infrastructure team as a result.

Can you discuss how the Mojo programming language was initially built for your own team?

Modular’s vision is to enable AI to be used by anyone, anywhere. Everything we do at Modular is focused on that goal, and we work backwards from it in the way we build out our products and technology. In this light, our own developer velocity is what matters to us first, and having built so much of the existing AI infrastructure for the world, we needed to carefully consider what would enable our team to move faster. We have lived through the two-world problem in AI – where researchers live in Python, and production and hardware engineers live in C++ – and we had no choice but to either barrel down that road or rethink the approach entirely. We chose the latter. There was a clear need to solve this problem, but many different ways to solve it – we approached it with our strong belief in meeting the ecosystem where it is today and enabling a simpler lift into the future. Our team bears the scars of software migration at large scale, and we didn’t want a repeat of that. We also realized that no language today, in our opinion, can solve all the challenges we are attempting to solve for AI, so we undertook a first-principles approach, and Mojo was born.

How does Mojo enable seamless scaling and portability across many types of hardware?

Chris, I, and our team at Google (many of whom are now at Modular) helped bring MLIR into the world years ago, with the goal of helping the global community solve real challenges by enabling AI models to be consistently represented and executed on any type of hardware. MLIR is a new type of open-source compiler infrastructure that has been adopted at scale and, through LLVM, is rapidly becoming the new standard for building compilers. Given our team's history in creating this infrastructure, it's natural that we utilize it heavily at Modular, and it underpins our state-of-the-art approach to developing new AI infrastructure for the world. Critically, while MLIR is now being adopted fast, Mojo is the first language that really takes the power of MLIR and exposes it to developers in a unique and accessible way. This means it scales from Python developers who are writing applications, to performance engineers who are deploying high-performance code, to hardware engineers who are writing very low-level systems code for their unique hardware.

References to Mojo claim that it’s basically Python++, with the accessibility of Python and the high performance of C. Is this a gross oversimplification? How would you describe it?

Mojo should feel very familiar to any Python programmer, as it shares Python’s syntax. There are a few important differences, though, that you’ll see as you port a simple Python program to Mojo – even though it will just work out of the box. One of our core goals for Mojo is to provide a superset of Python – that is, to make Mojo compatible with existing Python programs – and to embrace the CPython implementation for long-tail ecosystem support. You can then slowly augment your code, replacing non-performant parts with Mojo’s lower-level features – explicitly managing memory, adding types, utilizing autotuning, and much more – to get the performance of C or better! We feel Mojo gives you the best of both worlds: you don’t have to write, and rewrite, your algorithms in multiple languages. We appreciate that Python++ is an enormous goal and will be a multi-year endeavor, but we are committed to making it a reality and to enabling our legendary community of more than 140K developers to help us build the future together.
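To make this "augment as you go" workflow concrete, here is a minimal sketch in plain Python. The annotated version uses ordinary Python type hints only to illustrate where explicit types would be introduced; the Mojo-specific `fn` declarations and concrete types such as `Float64` mentioned in the comments are paraphrased from the description above, not verified Mojo code.

```python
# A plain, fully dynamic Python function -- the starting point, which
# Mojo aims to run unchanged as a superset of Python.
def euclid_norm(values):
    total = 0.0
    for v in values:
        total += v * v
    return total ** 0.5

# The same routine with type annotations added. In Python these are
# hints only; the point is to show the shape of the progressive-typing
# workflow. (In Mojo one would go further, declaring the function with
# `fn` and concrete types like Float64 -- sketched here, not verified.)
def euclid_norm_typed(values: list[float]) -> float:
    total: float = 0.0
    for v in values:
        total += v * v
    return total ** 0.5

print(euclid_norm([3.0, 4.0]))        # 5.0
print(euclid_norm_typed([3.0, 4.0]))  # 5.0
```

Both versions compute the same result; the value of the typed form is that a compiler with real static types can specialize the hot loop instead of interpreting it dynamically.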

In a recent keynote, it was showcased that Mojo is 35,000x faster than Python. How was this speed calculated?

It’s actually 68,000x now! But let's recognize that it's just a single program, Mandelbrot – you can go and read a series of three blog posts on how we achieved this – here, here and here. Of course, we’ve been doing this a long time, and we know that performance games aren’t what drive language adoption (despite being fun!) – it’s developer velocity, language usability, high-quality toolchains and documentation, and a community utilizing the infrastructure to invent and build in ways we can’t even imagine. We are tool builders, and our goal is to empower the world to use our tools to create amazing products and solve important problems. Our larger goal is actually to create a language that meets you where you are today and then lifts you easily into a better world. Mojo gives you a highly performant, usable, statically typed, and portable language that seamlessly integrates with your existing Python code – the best of both worlds. It enables you to realize the true power of the hardware, with multithreading and parallelization, in ways that raw Python today cannot – giving the global developer community a single language that scales from top to bottom.
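For context, the benchmark in question is the classic escape-time Mandelbrot computation. The following is an illustrative Python version of that kind of kernel – not the actual code from Modular's blog posts – showing the tight numeric loop that explicit types, vectorization, and parallelization accelerate so dramatically.

```python
# Escape-time Mandelbrot kernel, the kind of tight numeric loop used in
# such benchmarks (illustrative only, not Modular's published code).
def mandelbrot_point(cr: float, ci: float, max_iters: int = 200) -> int:
    """Return the iteration count at which |z| exceeds 2 for c = cr + ci*i,
    or max_iters if the point never escapes (i.e., it is in the set)."""
    zr = zi = 0.0
    for i in range(max_iters):
        # z = z*z + c, expanded into real arithmetic
        zr, zi = zr * zr - zi * zi + cr, 2.0 * zr * zi + ci
        if zr * zr + zi * zi > 4.0:
            return i
    return max_iters

# A point inside the set never escapes; a point far outside escapes at once.
print(mandelbrot_point(0.0, 0.0))  # 200 (in the set)
print(mandelbrot_point(2.0, 2.0))  # 0 (escapes immediately)
```

An interpreter pays per-iteration dynamic-dispatch overhead on every arithmetic operation in this loop, which is why a compiled, statically typed version of the same few lines can be orders of magnitude faster.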

Mojo’s magic is its ability to unify programming languages with one set of tools. Why is this so important?

Languages always succeed by the power of their ecosystems and the communities that form around them. We’ve been working with open-source communities for a long time, and we are incredibly thoughtful about engaging in the right way and ensuring that we do right by the community. We’re working incredibly hard to ship our infrastructure, but we need time to scale out our team – so we won’t have all the answers immediately, but we’ll get there. Stepping back, our goal is to lift the Python ecosystem by embracing the whole existing ecosystem; we aren’t seeking to fracture it like so many other projects have. Interoperability simply makes it easier for the community to try our infrastructure without having to rewrite all their code, and that matters a lot for AI.

Also, we have learnt so much from the development of AI infrastructure and tools over the last ten years. The existing monolithic systems are not easily extensible or generalizable outside of their initial target domain, and the consequence is a hugely fragmented AI deployment industry with dozens of toolchains carrying different tradeoffs and limitations. These design patterns have slowed the pace of innovation by being less usable, less portable, and harder to scale.

The next-generation AI system needs to be production-quality and meet developers where they are. It must not require an expensive rewrite, re-architecting, or re-basing of user code. It must be natively multi-framework, multi-cloud, and multi-hardware. It needs to combine the best performance and efficiency with the best usability. This is the only way to reduce fragmentation and unlock the next generation of hardware, data, and algorithmic innovations.

Modular recently announced raising $100 million in new funding, led by General Catalyst with participation from existing investors GV (Google Ventures), SV Angel, Greylock, and Factory. What should we expect next?

This new capital will primarily be used to grow our team, hiring the best people in AI infrastructure, and to continue meeting the enormous commercial demand we are seeing for our platform. Modverse – our community of well over 130K developers and tens of thousands of enterprises – is seeking our infrastructure, so we want to keep scaling and working hard to develop it for them, and to deliver it to them. We hold ourselves to an incredibly high standard; the products we ship are a reflection of who we are as a team and who we become as a company. If you know anyone who is driven, who loves the boundary of software and hardware, and who wants to help AI penetrate the world in a meaningful and positive way – send them our way.

What is your vision for the future of programming?

Programming should be a skill that everyone in society can develop and utilize. For many, the “idea” of programming instantly conjures a picture of a developer writing complex, low-level code that requires heavy math and logic – but it doesn’t have to be perceived that way. Technology has always been a great productivity enabler for society, and by making programming more accessible and usable, we can empower more people to embrace it. Empowering people to automate repetitive processes and simplify their lives is a powerful way to give them time back.

And in Python, we already have a wonderful language that has stood the test of time – it's the world's most popular language, with an incredible community – but it also has limitations. I believe we have a huge opportunity to make it even more powerful, and to encourage more of the world to embrace its beauty and simplicity. As I said earlier, it's about building products that have progressive disclosure of complexity – enabling high level abstractions, but scaling to incredibly low level ones as well. We are already witnessing a significant leap with AI models enabling progressive text-to-code translations – and these will only become more personalized over time – but behind this magical innovation is still a developer authoring and deploying code to power it. We’ve written about this in the past – AI will continue to unlock creativity and productivity across many programming languages, but I also believe Mojo will open the ecosystem aperture even further, empowering more accessibility, scalability and hardware portability to many more developers across the world.

To finish, AI will penetrate our lives in untold ways, and it will exist everywhere – so I hope Mojo catalyzes developers to go and solve the most important problems for humanity faster – no matter where they live in our world. I think that’s a future worth fighting for.


Data Exchange Interview

Interview with Tim Davis, Co-Founder of Modular. Full interview here

Ben (Host): Welcome to the Data Exchange Podcast. Today we’re joined by Tim Davis, co-founder and Chief Product Officer at Modular. Their tagline says it all: The future of AI development starts here. Tim, great to have you on the show.
Tim Davis: Great to be here, Ben—thanks for having me.

Introducing Mojo: Python, Reimagined

Ben: Let’s dive right in. What is Mojo, and what can developers use today?
Tim: Mojo is a new programming language—a superset of Python, or “Python++,” if you will. Right now, anyone can sign up at modular.com/mojo to access our cloud-hosted notebook environment, play with the language, and run unmodified Python code alongside Mojo’s advanced features.

“All your Python code will execute out of the box—you can then take performance-critical parts and rewrite them in Mojo to unlock 5–10× speedups.”

That uplift comes from our state-of-the-art compiler and runtime stack, built on MLIR and LLVM foundations.

Solving the Two-Language Problem

Many ML frameworks hide C++/CUDA complexity behind Python APIs, but that split still causes friction. Mojo bridges the gap:

  • Prototype in Python

  • Optimize in Mojo (same codebase)

“Researchers no longer need to drop into C++ for speed; they stay in one language from research to production.”

This unified model dramatically accelerates the path from idea to deployment.

Who is Mojo For?

Ben: Frameworks like TensorFlow and PyTorch already tackle performance. Who’s Mojo’s target audience?
Tim: Initially, it’s us—Modular’s own infrastructure team. But our real audience spans:

  1. Systems-level ML engineers who need granular control and performance.

  2. GPU researchers wanting a seamless path to production without rewriting code.

By meeting developers where they are, Mojo helps unify fragmented ML stacks and simplify pipelines.

Under the Hood: Hardware-Agnostic Design

Mojo’s architecture is built for broad hardware support:

  • MLIR (Multi-Level IR): Provides a common representation across hardware.

  • LLVM Optimizations: Powers high-performance codegen.

  • Multi-Hardware Portability: CPUs, GPUs, TPUs, edge devices, and beyond.

“We want access to all hardware types. Today’s programming model is constrained—Mojo opens up choice.”

This means you’re not locked into CUDA or any single accelerator vendor.

Beyond the Language: Unified AI Inference Engine

Modular also offers a drop-in inference engine:

  • Integrates with Triton, TF-Serving, TorchServe

  • CPUs first (batch workloads), GPUs coming soon

  • Orders-of-magnitude performance gains

“Simply swap your backend and get massive efficiency improvements—no changes to your serving layer.”

Enterprises benefit from predictable scaling and hardware flexibility, whether on Intel, AMD, ARM-based servers, or custom ASICs.

Roadmap: Community, Open Source & Enterprise

Next 6–12 Months:

  • Expand Mojo’s language features (classes, ownership, lifetimes).

  • Enable GPU execution (beyond the cloud playground).

  • Extend the inference engine to training, dynamic workloads, and full pipeline optimizations (pre-/post-processing).

“We released early to learn from real users—80,000 sign-ups across 230+ countries. Their feedback drives our roadmap.”

Why a New Language Matters

Mojo’s core value prop can be summed up in three words:

  1. Usable: Drop-in Python compatibility; gentle learning curve.

  2. Performant: Advanced compiler + runtime yields 5–10× speedups out of the box.

  3. Portable: Write once, run anywhere—from cloud GPUs to mobile CPUs.

Together, these unlock faster innovation, lower costs, and broader hardware choice.

Democratizing AI Development

In Tim’s own words:

“Our mission is to make AI development accessible to anyone, anywhere. By rethinking the entire stack, we’re unlocking a new wave of innovation and putting compute power in more hands.”

With its unified language and inference engine, Modular is ushering in a future where AI development truly starts here—for researchers, engineers, and enterprises alike.
