A couple of months ago, Microsoft added generative AI features to Windows 11 in the form of a taskbar-mounted version of the Bing chatbot. Starting this summer, the company will be going even further, adding a new ChatGPT-driven Copilot feature that can be used alongside your other Windows apps. The company announced the change at its Build developer conference alongside another new batch of Windows 11 updates due later this year. Windows Copilot will be available to Windows Insiders starting in June.
Like the Microsoft 365 Copilot, Windows Copilot is a separate window that opens up along the right side of your screen and assists with various tasks based on what you ask it to do. A Microsoft demo video shows Copilot changing Windows settings, rearranging windows with Snap Layouts, summarizing and rewriting documents that were dragged into it, and opening apps like Spotify, Adobe Express, and Teams. Copilot is launched with a dedicated button on the taskbar.
Windows is getting an upgraded Clippy, one that shares its name with the biggest copyright infringement and open source license violation in history. In fact, some of the Windows Copilot features are built atop GitHub Copilot, such as the new “AI” features coming to Windows Terminal. Now you can get other people’s code straight into your terminal, without their permission, and without respecting their licenses. Neat!
I wonder how long it’ll take for someone in the EU to sue Microsoft for anti-competitive practices again. Hopefully this time it’ll happen before all the damage is done.
It’s unclear to me how a copyright judge would rule on this, and I have to confess I’m not sure how I feel about it myself either.
AI is very transformative, and I’m hesitant to support a blanket ban on AIs learning from public content and works available online. After all, we don’t apply that standard to humans. As a human developer, I can read copyrighted books and open source code, learn how to technically accomplish something from that work, and then use that knowledge in my own work. Am I wrong for doing so? Outside of trade secrets, it’s traditionally implied that readers have the right to learn from what they read and to use their newfound knowledge without any permission at all. So to dictate that AI may not learn anything from existing works without the author’s permission seems too excessive. It creates one standard for humans and another for AI, and I’m not comfortable with that. Obviously, if an AI implementation is actually violating copyrights, then it should be addressed. But it seems wrong to me that we should cordon off the internet and other public works with “human-only” yellow tape. Even school textbooks, medical journals, science journals, newspapers, and OSNews comments would be off limits, since they’re copyrighted without explicit permission given to AI.
Such restrictions would leave AI training heavily skewed toward public domain sources from a century ago, with huge gaps in modern knowledge and expertise. I’m not enamored with Microsoft, but just in terms of AI, I don’t know that this would be the best outcome for society.