Sunday, May 12, 2024

I think he may have a point.

https://academic.oup.com/ojls/advance-article-abstract/doi/10.1093/ojls/gqae017/7665668

The Data Crowd as a Legal Stakeholder

This article identifies a new legal stakeholder in the data economy: the data crowd. A data crowd is a collective that: (i) is unorganised, non-deliberate and unable to form an agenda; (ii) relies on productive aggregation that creates an interdependency among participants; and (iii) is subjected to an external authority. Notable examples of crowds include users of a social network, users of a search engine and users of artificial intelligence-based applications. The law currently protects users in the data economy only as individuals and, in certain cases, may address broad public concerns. However, it does not recognise the collective interests of the crowd of users or its unique vulnerability to platform power. The article presents and defends the crowd’s legal interests in a stable infrastructure for participation. It therefore reveals the need for a new approach to consumers’ rights in the data economy.


Tools & Techniques.

https://techcrunch.com/2024/05/11/u-k-agency-releases-tools-to-test-ai-model-safety/

U.K. agency releases tools to test AI model safety

The U.K. AI Safety Institute, the country’s recently established AI safety body, has released a toolset designed to “strengthen AI safety” by making it easier for industry, research organizations and academia to develop AI evaluations.

Called Inspect, the toolset, which is available under an open source MIT License, aims to assess certain capabilities of AI models, including models’ core knowledge and ability to reason, and to generate a score based on the results.

In a press release announcing the news on Friday, the Safety Institute claimed that Inspect marks “the first time that an AI safety testing platform which has been spearheaded by a state-backed body has been released for wider use.”
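To make the idea of an Inspect evaluation concrete, here is a minimal sketch in Python. It assumes the open source inspect_ai package and the building blocks described in its documentation (a Task over a dataset of Samples, a generate() solver and a match() scorer); the toy arithmetic dataset is purely illustrative, and exact parameter names may vary between releases.

    # Minimal sketch of an Inspect evaluation; assumes the open source
    # inspect_ai package from the U.K. AI Safety Institute.
    from inspect_ai import Task, task
    from inspect_ai.dataset import Sample
    from inspect_ai.scorer import match
    from inspect_ai.solver import generate

    @task
    def toy_arithmetic():
        # A tiny illustrative dataset: each Sample pairs a prompt with the
        # answer the scorer will look for in the model's output.
        return Task(
            dataset=[
                Sample(input="What is 7 + 5?", target="12"),
                Sample(input="What is 9 * 6?", target="54"),
            ],
            solver=generate(),  # ask the model under test for a completion
            scorer=match(),     # mark each sample by matching the target
        )

Running the evaluation against a particular model is then done from the command line with something along the lines of "inspect eval toy_arithmetic.py --model openai/gpt-4", after which Inspect aggregates the per-sample results into the kind of score the press release describes.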


