#computing

6 posts · Last used 1d

theguardian_us_technology
@theguardian_us_technology@halo.nu · 2d ago
TheBadPlace
@TheBadPlace@mastodon.ozioso.online · Apr 28, 2026
The Guardian | Tell us: have you become emotionally attached to AI? (by the Guardian community team)

AI-generated summary; read the full article for complete information. The Guardian is inviting readers who have formed emotional attachments to AI chatbots to share their personal experiences. The call-out asks people aged 18 or over to describe how they use AI chatbots, the nature of their relationship, and any relevant background details, with the option to submit anonymously. Responses will be encrypted, kept secure, and used solely for a forthcoming feature, after which any personal data will be deleted. Participants can also indicate whether they are willing to be contacted for further discussion or featured in audio or video pieces.

Read more: https://www.theguardian.com/technology/2026/apr/28/tell-us-have-you-become-emotionally-attached-to-ai

#aiartificialintelligence #computing
Linux
@Linux@linuxrocks.online · Apr 25, 2026
Time to switch to the freedom of Linux - people are indeed doing it in increasing numbers. E.g. Linux recently passed a 5% share among Steam users 🐧 👉 https://youtu.be/uWnM16cpINk #Linux #deleteMicrosoft #Recall #freedom #computing #Microslop
negativepid
@negativepid@mastodon.social · Mar 15, 2026
In reply to
meltedcheese
@meltedcheese@c.im · Nov 17, 2025
@cstross@wandering.shop @skjeggtroll@mastodon.online Moore's Law is dead for now. I did a study a few years ago to look at what is happening and will happen in microprocessors. The short story is that traditional processor architecture is hitting end of life. Feature sizes are now so small that quantum effects are a significant factor, and the combination of high speed and small size puts us up against a thermal barrier as well.

Clever approaches with System-on-a-Chip (#SoC), 3D stacking, maybe Processor-in-Memory (#PIM), and distributed multiprocessing will squeeze out more progress for maybe a decade. After that comes the next computing revolution: a shift to non-Von Neumann #computing. #Quantum has the spotlight because that's the really big win, but there are other approaches that are likely to be commercially viable before quantum is mature.

I'm optimistic about the tech, less so about the rate of adoption and change that will be required, especially if the most talented early-career computer scientists and engineers keep chasing the associative/statistical methods that include LLMs.
