Well, here’s my worry. From my understanding, the EU wants (say) foundation model builders to certify that their models meet certain criteria. That’s a nice idea in itself, but there’s a risk of the certification process being too burdensome for FOSS developers of foundation models. Worse still, would FOSS projects end up legally liable for downstream uses of their models? Don’t forget that, unlike proprietary software, whose EULAs disclaim developer liability, FOSS places no restrictions on how end users use the software (in fact, any such restrictions generally make it non-FOSS).
A foundation model is not an application. It’s up to the people wanting to run AI in a high-risk scenario to make sure that the models they’re using are up to the task; if they can’t say that about some FOSS model, then they can’t use it. And, honestly, would you want some CV or college application scanner to involve DeepDanbooru?
The regulation doesn’t only put obligations on users. Providers (which can include FOSS developers?) would have to seek approval for AI systems that touch on certain areas (e.g. vocational training), and providers of generative AI would be required to “design the model to prevent it from generating illegal content” and to publish “summaries of copyrighted data used for training”. The devil is in the details, and I’m not so sanguine about it being FOSS-friendly.