The Macro: Scientists Spend More Time on Setup Than on Science
I talked to a computational physicist last year who told me that roughly 70 percent of her time goes to configuring simulations, not analyzing results. Setting up the environment. Getting dependencies to play nice. Writing job scripts for the university cluster. Debugging parameter files. Running validation checks. Waiting for queue allocation. By the time the actual computation finishes, she has spent days on infrastructure that contributes nothing to her research.
This is the dirty secret of computational science. The tools are powerful but the workflows are brutal. ANSYS, COMSOL, OpenFOAM, MATLAB, custom Fortran codes. Each has its own input format, its own quirks, its own way of failing silently. Researchers become systems administrators by necessity, maintaining environments and build chains instead of doing the science they were trained for.
The market for scientific computing tools is large and growing. ANSYS reported over $2 billion in revenue last year. COMSOL is private but profitable. Rescale and its competitors sell cloud compute access to engineers who need HPC resources without managing hardware. But none of these companies has seriously applied AI to the workflow automation layer. They sell compute and simulation engines. The pipeline between “I have a research question” and “the simulation is running correctly” is still entirely manual.
What makes this space interesting right now is that LLMs can understand both natural language descriptions of scientific problems and the technical configuration files those problems require. The translation layer between “simulate turbulent flow over this airfoil at Mach 0.8” and the actual OpenFOAM case files needed to run that simulation is exactly the kind of structured generation that modern AI handles well.
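To make that concrete, here is a toy sketch (mine, not Fluidize's) of the deterministic half of that translation: once a model has extracted structured parameters from a request like the one above, emitting a syntactically valid OpenFOAM `controlDict` is straightforward templating. The solver name and the numbers are illustrative placeholders.

```python
def control_dict(solver: str, end_time: float, delta_t: float,
                 write_interval: int) -> str:
    """Render a minimal OpenFOAM controlDict from structured parameters.

    Illustrative only: a real case also needs mesh, boundary-condition,
    and numerical-scheme files, and the hard part for an AI layer is
    extracting these parameters reliably, not printing them.
    """
    return "\n".join([
        "FoamFile",
        "{",
        "    version     2.0;",
        "    format      ascii;",
        "    class       dictionary;",
        "    object      controlDict;",
        "}",
        f"application     {solver};",
        "startTime       0;",
        f"endTime         {end_time};",
        f"deltaT          {delta_t};",
        f"writeInterval   {write_interval};",
    ])

# rhoSimpleFoam is OpenFOAM's steady-state compressible solver, a plausible
# pick for transonic flow; the time values here are placeholders.
print(control_dict("rhoSimpleFoam", 2000, 1, 200))
```

The point is not the templating itself but that the target format is rigid and machine-checkable, which is exactly the setting where structured generation from a language model works well.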
The Micro: Three Harvard Grads With NASA and MIT Pedigrees
Fluidize is an AI platform for scientific computing that lets researchers transform experimental workflows into visual pipelines using natural language. You describe what you want to simulate, and the system handles setup, parameterization, execution, and validation. Pipelines are reproducible and shareable. You can prototype locally and then scale to cloud or cluster compute without reconfiguring anything. The underlying source code stays transparent and accessible.
The platform wraps around existing simulation stacks, which is critical. Scientists are not going to abandon ANSYS or OpenFOAM for a new tool. They need something that makes their existing tools easier to use. Fluidize integrates both open-source and licensed software, handles dependencies and versioning automatically, and provides shared collaboration dashboards for teams.
Henry Bae, Alex Fleury, and Jamin Liu founded Fluidize, and the team reads like a recruiter’s wish list. Henry studied Physics and Computer Science at Harvard and worked at NASA’s JPL, Goddard, and Kennedy Space Center. Alex has a Harvard AB in Computer Science and Statistics, did AI research, and generated six-figure monthly client revenue in consulting roles at Strategy& and Lazard. Jamin holds both a Bachelor’s and a concurrent Master’s in Computer Science and Molecular Biology from Harvard, did autonomous driving research at MIT Lincoln Labs, and has prior startup scaling experience. They came through Y Combinator’s Summer 2025 batch.
The competitive landscape is sparse for AI-native scientific workflow tools. Weights & Biases and MLflow handle ML experiment tracking but not general scientific simulation. Nextflow and Snakemake are workflow managers for bioinformatics but require significant manual configuration. Nobody has built a natural-language-to-simulation-pipeline product with broad scientific domain coverage. The closest analog is what Replit did for software development: make complex setup invisible, but for scientific computing.
The Python SDK on GitHub suggests they are building for developers first, which is the right entry point. Scientists who code are the early adopters who will validate the platform and then pull in their less technical collaborators.
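For a sense of what a developer-first entry point typically feels like, here is a minimal, self-contained sketch of the prototype-locally pattern. Every name in it is invented for illustration; I have not used Fluidize's actual SDK, and its real API may look nothing like this.

```python
from dataclasses import dataclass, field

# Hypothetical pipeline API, invented for this sketch. Not Fluidize's SDK.
@dataclass
class Stage:
    name: str
    run: callable

@dataclass
class Pipeline:
    stages: list = field(default_factory=list)

    def add(self, name, fn):
        # Returning self keeps pipeline definitions readable via chaining.
        self.stages.append(Stage(name, fn))
        return self

    def execute(self, params):
        # Run each stage in order, threading results forward. A real
        # platform would put logging, caching, and remote execution
        # behind this same single call.
        for stage in self.stages:
            params = stage.run(params)
        return params

pipe = (Pipeline()
        .add("setup", lambda p: {**p, "mesh": "ready"})
        .add("solve", lambda p: {**p, "result": p["mach"] * 2}))

print(pipe.execute({"mach": 0.8}))  # {'mach': 0.8, 'mesh': 'ready', 'result': 1.6}
```

The design choice worth noticing is that `execute` is one call whether the work runs on a laptop or a cluster; the scaling decision lives in configuration, not in the pipeline definition, which is what "prototype locally, then scale without reconfiguring" would require.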
The Verdict
I think Fluidize is going after one of the most underserved markets in software. Computational scientists have been stuck with the same workflow pain for decades and nobody has given them modern tooling. The TAM is real. Every university, national lab, pharmaceutical company, and engineering firm running simulations is a potential customer.
The risk is breadth. Scientific computing spans hundreds of domains, each with its own tools, file formats, and workflow conventions. Supporting fluid dynamics, molecular dynamics, genomics, and materials science simultaneously is an enormous engineering challenge. If Fluidize tries to go too broad too fast, the product will feel shallow in every domain. If they go deep in one vertical first, they limit initial growth but build something genuinely useful.
In thirty days, I want to see which scientific domains they are targeting first and whether researchers in those domains actually complete workflows end-to-end on the platform. At sixty days, the question is whether the reproducibility features drive sharing and collaboration or whether each user builds isolated pipelines. At ninety days, I want to know whether they have converted any university lab groups into team customers, because the viral loop in academic science runs through lab groups, not individuals. The founding team has the credentials and the technical range to build something important here. The market is waiting for exactly this product.