Three remarkable things in AI this week: Google and open-source, protein-folding, and Snap

Techtonic

Do I feel shame for using this as the cover image instead of the protein one? I most certainly do. Source: Snap

Google keeps the pace in open source

What happened: Google released the Gemma 2 family of models to the public. Gemma is the open-source cousin (sibling?) of Gemini, its flagship, proprietary LLM. The release announcement listed a wide range of improvements, with a particular focus on efficiency and speed, as well as integration into a wide range of common tools and frameworks.

Why it matters: The original release of Gemma was Google’s hedge against the possibility that Meta might be right, and that the future of LLMs might be open-source (or rather, the kind of pseudo-open flexible licensing that both Llama and Gemma use). It’s no surprise that Google has kept pushing down this path, and probably for the same reason as before: if LLMs do move toward being open, it doesn’t want Meta to own the space outright. Following Apple’s recent release of open-source models, and Microsoft’s toe in the water, three of the largest players in the space are now hedging their bets.

Using generative AI to discover novel proteins

What happened: Evolutionary Scale, a startup that builds AI tools to discover new proteins, released ESM3, its new model, in a paper with the snappy title “Simulating 500 million years of evolution with a language model.” The paper also announces the discovery of a novel fluorescent protein very different from any previously known. The company has also raised $142m in seed funding.

Why it matters: Protein folding is a highly complex problem with significant implications for biology and medicine. It has been floated as a candidate application for various frontier technologies over the years, most recently LLMs. ESM3 is a practical demonstration that this can be done at scale, and it seems likely that the pace of discovery is only going to increase. This kind of research is one of the unambiguously positive uses of artificial intelligence.

At last, our long wait is over

What happened: Snap released a new version of Lens Studio that includes generative AI tools. Lens Studio is the developer suite for building inside Snap’s world, which means you’ll soon be seeing AI-generated effects.

Why it matters: Look, it’s great that you are already able to “make your lips plump and pouty with the Snapchat Big Lips lens.” But surely there must be more. Will we finally be able to have uncanny-valley AI slop without leaving the Snap ecosystem? I hope you appreciate this update, because the searches I conducted to find that last example will haunt my dreams.

Three remarkable things is a more-or-less weekly roundup of interesting events from the AI community over the past more-or-less week.

Techtonic

I'm a company CEO and data scientist who writes on artificial intelligence and its connection to business. Also at https://www.linkedin.com/in/james-twiss/