FinalSpark - but honey, can it play chess?

Discussion of anything and everything relating to chess playing software and machines.

Moderators: hgm, Rebel, chrisw

Posts: 2797
Joined: Wed Mar 10, 2010 10:18 pm
Location: Hamburg, Germany
Full name: Srdja Matovic

FinalSpark - but honey, can it play chess?

Post by smatovic »

World's First Bioprocessor Uses 16 Human Brain Organoids, Consumes Less Power ... less-power
"A Swiss biocomputing startup has launched an online platform that provides remote access to 16 human brain organoids," reports Tom's Hardware: FinalSpark claims its Neuroplatform is the world's first online platform delivering access to biological neurons in vitro. Moreover, bioprocessors like this "consume a million times less power than traditional digital processors," the company says.
In a recent research paper about its developments, FinalSpark claims that training a single LLM like GPT-3 required approximately 10 GWh — about 6,000 times greater energy consumption than the average European citizen uses in a whole year. Such energy expenditure could be massively cut following the successful deployment of bioprocessors.
The operation of the Neuroplatform currently relies on an architecture that can be classified as wetware: the mixing of hardware, software, and biology.
"While a wetware computer is still largely conceptual, there has been limited success with construction and prototyping, which has acted as a proof of the concept's realistic application to computing in the future."
World's first bioprocessor uses 16 human brain organoids for ‘a million times less power’ consumption than a digital chip ... gital-chip
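As a quick sanity check on the quoted figures: the per-citizen number is only implied by the claim, but dividing 10 GWh by 6,000 gives roughly 1.7 MWh per year, which is in the right ballpark for per-capita household electricity use in Europe.

```python
# Sanity check of the article's energy claim (figures taken from the quote;
# the per-citizen consumption is derived here, not stated in the article).
gpt3_training_gwh = 10   # claimed energy to train GPT-3
ratio = 6000             # claimed multiple of one citizen's annual use

kwh_per_gwh = 1_000_000
implied_citizen_kwh = gpt3_training_gwh * kwh_per_gwh / ratio
print(f"Implied annual consumption per citizen: {implied_citizen_kwh:.0f} kWh")
# -> Implied annual consumption per citizen: 1667 kWh
```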

I predicted that by the time we reached the 8 billion humans mark, we would have developed another groundbreaking computing technology, similar to the advent of the transistor, the IC and the microchip. Looks like wetware, or something alike, is a candidate.

User avatar
Posts: 11751
Joined: Thu Mar 09, 2006 12:57 am
Location: Birmingham UK

Re: FinalSpark - but honey, can it play chess?

Post by towforce »

One word for that article: hype.

Wetware has been around since the 1990s (in nature it has been around for about 500 million years - since the Cambrian Explosion) - but nobody is going to be using it for AI.

Here are some more realistic future scenarios - link.

The article correctly states that LLMs like GPT take a lot of power to train - but nobody is ever going to be training a model that large in wetware.
The simple reveals itself after the complex has been exhausted.