A foundation model is not an application. It’s up to the people wanting to run AI in a high-risk scenario to make sure the models they’re using are up to the task; if they can’t say that about some FOSS model, then they can’t use it. And, honestly, would you want some CV or college-application scanner involving DeepDanbooru?
The regulation doesn’t only put obligations on users. Providers (which can include FOSS developers?) would have to seek approval for AI systems that touch on certain areas (e.g. vocational training), and providers of generative AI are required to “design the model to prevent it from generating illegal content” and to publish “summaries of copyrighted data used for training”. The devil is in the details, and I’m not so sanguine about it being FOSS-friendly.
“5e. This Regulation shall not apply to AI components provided under free and open-source licences except to the extent they are placed on the market or put into service by a provider as part of a high-risk AI system or of an AI system that falls under Title II or IV. This exemption shall not apply to foundation models as defined in Art 3.”
Interesting: no foundation-model exception, though the FLOSS community isn’t going to train any of those soon in any case.
Or am I reading that wrong, and the “unless placed on the market” part is the exemption that shall not apply, rather than the whole of 5e? Gods.
More broadly, this is the same issue as with the Cyber Resilience Act, and they’re definitely on top of it, as in saying “we don’t want FLOSS to suffer by a misinterpretation of ‘to put on the market’”. Patience: none of this is law yet, but the very act of amending it this way tells courts not to interpret it that way.
Ok, the text quoted above is what Parliament passed, i.e. its amendments.
In case you have use for it: the base version the Parliament diffed against. Why aren’t they using a proper VCS in <currentyear>?
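For what it’s worth, the kind of diff a proper VCS would produce for an amendment is trivial to generate; here’s a minimal sketch using Python’s standard-library difflib, with made-up before/after clause text purely for illustration (the file labels and wording are assumptions, not the actual legislative texts):

```python
import difflib

# Hypothetical base (Council) text of a clause, hard-wrapped one line per segment.
base = [
    "This Regulation shall not apply to AI components",
    "provided under free and open-source licences.",
]

# Hypothetical amended (Parliament) text of the same clause.
amended = [
    "This Regulation shall not apply to AI components",
    "provided under free and open-source licences,",
    "except to the extent they are placed on the market.",
]

# Produce a unified diff, the same format `git diff` uses.
diff = list(difflib.unified_diff(
    base, amended,
    fromfile="council_text", tofile="parliament_text",
    lineterm="",
))
print("\n".join(diff))
```

Running this prints a familiar unified diff: unchanged lines prefixed with a space, removed lines with “-”, and added lines with “+”, which is exactly the amendment-tracking format the institutions approximate by hand today.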