Attached images: microwhore.png (88.7KB, 1393x690), microshaft.jpeg (208.5KB, 770x1221), microwhore2.png (94.5KB, 991x515)
Crossposted from >>>/k/53949
An unholy union of Microshit, the BBC, Intel, Arm, and Adobe formed a group called the "Coalition for Content Provenance and Authenticity" (C2PA) back in 2021 ( https://c2pa.org/ ), which is now testifying before Congress on using metadata and cryptographic signing to stop AI-generated shit from being passed off as real and becoming "fake news". They will use AI as the bogeyman to enslave us under the yoke of technology, since it would be too dangerous to let man think freely...
How it works:
Basically, it embeds a signed manifest in the file: a hash of your picture, signed with your private key (backed by a certificate). No botnet system will accept "unsigned" files.
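A rough sketch of the hash-and-sign idea, assuming nothing about the real C2PA manifest format (the "manifest" dict below is made up, and Ed25519 via Python's cryptography package stands in for whatever certificate scheme the spec actually uses):

import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def sign_image(path, private_key):
    # Hash the file bytes and sign the digest; the pair acts as a toy "manifest".
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return {"sha256": digest, "signature": private_key.sign(digest.encode()).hex()}

def verify_image(path, manifest, public_key):
    # A platform that rejects "unsigned" files would run a check like this on upload.
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    if digest != manifest["sha256"]:
        return False
    try:
        public_key.verify(bytes.fromhex(manifest["signature"]), digest.encode())
        return True
    except InvalidSignature:
        return False

Anything edited, stripped of its manifest, or signed by a key the platform doesn't trust simply fails the check and gets refused or flagged.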
Some key points:
The "Harms modelling" PDF notes several of the risks in this and deliberately downplays a few. For example:
>Political dissidents being tracked through C2PA manifest repositories, or 'bad actors' demanding manifest repositories to release sensitive information
Impact, scale, and likelihood: "Low"
>Forced association (requiring participation in the use of technology or surveillance to take part in society): de facto inclusion and participation obligation in marketplaces for creative content or journalistic content, or for better algorithmic ranking on social media sites, which disproportionately excludes global populations, marginalized communities and non-mainstream media who do not have access to relevant tools, or cannot consistently use tools because of privacy or other reasons
Severity, scale, and likelihood: Low; frequency: High
>Loss of freedom of movement or assembly to navigate the physical or virtual world with desired anonymity: C2PA-enabled systems that utilize a real-name identity or other real-world profile provide a mechanism to connect movement in space to an individual via C2PA metadata
Severity: Low. Scale: Low. Likelihood: Low. Frequency: High.
This one is straight up bullshit.
There will be an "optional" signing feature using your government-issued ID. They'll scrape the metadata like the good opportunists they are.
>Similarly, certain C2PA claim generators may allow content creators, including civic, community and independent media, to sign manifests with their personal certificates associated with their IDs. Although guidance to allow for anonymity and pseudonymity has been issued, specification-compliant tools may sell information to third-parties, or not follow user experience guidance meant to empower users to retain control of their information.
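To make the scraping concern concrete, here is a hypothetical sketch in the same vein (the field names are invented and do not match the real C2PA manifest schema): any identity assertion a "compliant" tool embeds alongside the signature is machine-readable and can be harvested in bulk.

import json

def scrape_identity(manifest_json):
    # Hypothetical field names -- the point is that real-name assertions,
    # certificate IDs, and location data carried in a manifest are trivially extractable.
    manifest = json.loads(manifest_json)
    creator = manifest.get("claim", {}).get("creator", {})
    return {
        "name": creator.get("name"),
        "certificate": creator.get("cert_id"),
        "gps": manifest.get("assertions", {}).get("exif:GPS"),
    }

One crawler pass over a manifest repository and every "pseudonymous" journalist who ever signed with an ID-backed certificate is in a database.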
Microshit is working on using AI to replace its content moderators from India, and is trying to get hardware implementations adopted globally.
https://www.pcworld.com/article/1923811/microsoft-will-id-its-ai-art-with-a-hidden-watermark.html
https://learn.microsoft.com/en-us/legal/cognitive-services/content-safety/transparency-note
>Pic related
>The concern over AI content has become so critical that camera makers Leica and Nikon are building the C2PA standard directly into cameras to authenticate images as real, and not AI-generated, according to Adobe.
Some quotes from the congressional testimony.
>Microsoft: "reputation management systems" for media uploaded online. "There's nothing wrong with misinformation, there's something wrong in not knowing where it comes from."
>Universal Music Group: "We should use metadata to see what is used in ML since that is likely "violative" [sic] so that we must bring them to the negotiating table"
>Adobe: "We propose that congress should impose a new 'anti-impersonation' right, that would give artists a right to assert against someone intentionally attempting to impersonate their style or likeness"
They want only media uploaded to the major sites to be "approved". Soon, everything you post, everything you use will require the internet and be linked back to your real name forever. The panopticon is here and ever watching...