Open Source AI is here


Today, the Open Source Initiative has released its first official definition of Open Source AI. This is an important milestone. Let me explain why.

Why Open Source matters

In a world where speech depends on software, free speech depends on free software.

The key tenet of open source is that it puts the user in control. Software is ever growing in complexity and importance in our society. Software is how we do our work, how we communicate, how we pay, how we access information. When software is a black box, subject to subtle (or not so subtle) manipulation, users are at risk.

Risks of AI

AI brings these risks to an entirely new level. Not only because it makes decisions that are often entirely opaque, but also because its easy, human-like interface often lulls users into trusting it far more than it deserves.

The big AI firms have done their level best to keep public attention on existential, contrived notions akin to Skynet, the AI from the Terminator movies: scenarios in which AI takes over the world. In reality, the risks associated with AI are far more mundane. Surveillance, bias, explosive energy usage and job losses are the concerns we should focus on.

Need for control

And, just like with software, what matters is control. Who controls the AI, who decides what it can and can't do, what goes in and what does not. With control, we can address the real risks of AI. Without control, we can only hope that the billion-dollar companies do the right thing.

They haven't in the past. So we need Open Source AI: AI that gives users the ability to study and modify the AI models that govern their lives.

Nextcloud Ethical AI Rating 🟢🟡🟠🔴

Nextcloud gave this a first shot in March 2023, when we launched our Ethical AI Rating. A simple traffic light shows with green/yellow/orange/red whether a given AI model is freely available, whether its training data is publicly available, and whether the code needed to run and train it is open source. This way we help users make an informed decision without restricting their choice of models and features.
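To make the idea concrete, here is a minimal sketch of how such a traffic-light rating could be derived from the three criteria. The mapping from criteria to colours shown here is an assumption for illustration, not Nextcloud's official rating logic, and the class and function names are made up for this example.

```python
# Hypothetical sketch: derive a traffic-light colour from the three openness
# criteria described above. The exact colour mapping is an assumption for
# illustration, not Nextcloud's official Ethical AI Rating rules.

from dataclasses import dataclass


@dataclass
class AIModel:
    name: str
    model_freely_available: bool   # can the model itself be downloaded and used?
    data_publicly_available: bool  # is the training data published?
    code_open_source: bool         # is the code to run and train it open source?


def ethical_ai_rating(model: AIModel) -> str:
    """Map how many of the three criteria are met to a traffic-light colour."""
    met = sum([
        model.model_freely_available,
        model.data_publicly_available,
        model.code_open_source,
    ])
    return {3: "green", 2: "yellow", 1: "orange", 0: "red"}[met]


# Example: open model and code, but no published training data -> "yellow"
print(ethical_ai_rating(AIModel("example-model", True, False, True)))
```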


Users of AI solutions deserve transparency and control, which is why we introduced our Ethical AI rating in early 2023. Now, we see big tech firms trying to hijack the term open source AI. We fully endorse the creation of a clear definition of open source AI by the community to protect users and the market.

Frank Karlitschek
CEO and founder of Nextcloud

The wider open source community has picked up the gauntlet as well, and after extensive consultation with the community, today the OSI has announced an official definition of Open Source AI. This will help users, from private individuals to governments, research institutes, hospitals and businesses, decide which systems they can trust.

This is a first step on a journey, and we are glad to be a part of it. Nextcloud has formally endorsed the definition, even though we think there is room for improvement. We will use it as a basis for our Ethical AI Rating. Our rating is a bit more granular and also more critical in some areas: for example, when it comes to data, we believe it should always be fully available. For now, we will therefore keep using our own rating, as it fits the use cases of our users better.

We look forward to your input, both on the OSI definition – on the road to a 2.0 – and on our AI rating.

Start the discussion at the Nextcloud forums.