Today, the Open Source Initiative has released its first official definition of Open Source AI. This is an important milestone. Let me explain why.
The key tenet of open source is that it puts the user in control. Software is ever growing in complexity and importance in our society. Software is how we do our work, how we communicate, how we pay, how we access information. When software is a black box, subject to subtle (or not so subtle) manipulation, users are at risk.
AI brings these risks to an entirely new level. Not only because it makes decisions that are often entirely opaque, but also because its easy, human-like interface often lulls users into trusting it far more than it deserves.
The big AI firms have done their level best to ensure that public attention around AI risks was aimed at existential, contrived notions akin to Skynet, the AI in the Terminator movie series: scenarios where AI takes over the world. In reality, the risks associated with AI are far more mundane. Surveillance, bias, explosive energy usage and job losses are the concerns we should focus on.
And, just like with software, what matters is control. Who controls the AI, who makes decisions about what it can and cannot do, what goes in and what does not. With control, we can address the real risks of AI. Without control, we can only hope that the billion-dollar companies do the right thing.
They haven’t in the past. So we need Open Source AI: AI that gives users the ability to study and modify the AI models that govern their lives.
Nextcloud gave this a first shot in March 2023, when we launched our Ethical AI Rating. A simple traffic light (green/yellow/orange/red) shows whether a given AI model is freely available, whether its data is publicly available, and whether the code needed to run and train it is open source. This way we help users make an informed decision without restricting their choice of models and features.
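To make the idea concrete, here is a minimal sketch in Python of how such a traffic-light rating could be derived from the three criteria. The names and the exact mapping from criteria to colours are illustrative assumptions for this post, not Nextcloud's actual rating implementation, which weighs the criteria in its own way.

```python
from dataclasses import dataclass

@dataclass
class AIModelInfo:
    """The three criteria mentioned above, per AI model."""
    model_freely_available: bool   # can the model weights be obtained freely?
    data_publicly_available: bool  # is the training data publicly available?
    code_open_source: bool         # is the code to run and train it open source?

def traffic_light(info: AIModelInfo) -> str:
    """Map the three criteria to a traffic-light colour.

    Hypothetical mapping: all three met -> green, two -> yellow,
    one -> orange, none -> red. The real rating may differ.
    """
    met = sum([info.model_freely_available,
               info.data_publicly_available,
               info.code_open_source])
    return {3: "green", 2: "yellow", 1: "orange", 0: "red"}[met]

# Example: open code and freely available weights, but closed training data.
example = AIModelInfo(model_freely_available=True,
                      data_publicly_available=False,
                      code_open_source=True)
print(traffic_light(example))  # -> "yellow"
```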
The wider open source community has picked up the gauntlet as well, and after extensive consultation with the community, today the OSI has announced an official definition of Open Source AI. This will help users, from private individuals to governments, research institutes, hospitals and businesses, decide which systems they can trust.
Today is a first step in a journey, and we are glad to be a part of it. Nextcloud has formally endorsed the definition, even though we think there is room for improvement, and we will use it as a basis for our Ethical AI Rating. Our rating is a bit more granular and also more critical in some areas – for example, when it comes to data, we believe it should always be fully available – and thus, for now, we will keep using it, as it better fits the use cases of our users.
We look forward to your input, both on the OSI definition – on the road to a 2.0 – and on our AI rating.