Something's been slowly shifting in the design zeitgeist. I've been watching my feed on X and the vibe has changed. More and more, I see designers sharing finished experiments or prototypes they coded themselves, rather than static Figma files. Moving from working on a canvas to talking to an LLM. The conversation isn't "here's a design I made" anymore... it's "here's something I shipped this afternoon."
I have ADHD and have found Home Assistant to be a valuable tool for managing executive dysfunction. I use it for audible calendar reminders, laundry reminders, timers, and monitoring my doorbell camera and my nanny cam for my dog. It's also a great source of pure nerdy joy for me. And I recently took the most joyously nerdy step yet in my home automation fixation.
When Sonos released its redesigned app in May 2024, the backlash was immediate and brutal. Users couldn't access basic features like volume control and alarms. Systems became unusable. The company's stock plummeted 25%. Eventually, the CEO was replaced, and lawsuits claimed over $5 million in damages from customers who'd lost functionality they'd paid for.
Hi everyone, I'm a solo developer who recently built a fan-made tool for the Roblox game: https://www.forgeore.com. The main goal of the site is to help players:

- Calculate forging probabilities based on different ore combinations
- Automatically find optimal ore recipes using a Smart Optimizer (this is the unique part)
- Browse a complete database of all 88 in-game ores with stats
- Share their builds via URL links for easy discussion in Discord/forums
One of the first places users notice gaps in visibility is Instagram Stories. The platform tells you who viewed a story, but it does not tell you who wanted to look without being noticed. That absence shapes behaviour. People avoid checking stories to prevent awkward signals, misunderstandings, or emotional reactions.

How Instagram obscures story viewing and follower context

Tools like the insta story viewer by FollowSpy exist
There's a particular kind of guilt that visits me when I open my feed reader after a few days away. It's not the guilt of having done something wrong, exactly. It's more like the feeling of walking into a room where people have been waiting for you, except when you look around, the room is empty. There's no one there. There never was.
The question dropped into the Slack channel before the user research summary. Before the problem was clearly defined. Before anyone asked if users actually needed this feature. Your product manager already generated three interface options in ChatGPT. Now they're asking which one to build. Not whether to build. Not why to build. Which. And when you slow the conversation down to ask those questions, you discover that strategic thinking now reads as bottleneck behavior.
Using a pre-built template strategy: The Atlassian team realized that AI was often messing up core elements and not fully understanding complex commands. So they created a sort of "design system" for their AI-led prototyping: they feed the tool a page of pre-coded elements that the AI doesn't change, while leaving the other elements open to interpretation for it to work on.
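A minimal sketch of what such a template might look like, assuming a React/TypeScript prototype (the component names and the LOCKED/EDITABLE convention here are my own illustration, not Atlassian's actual system):

```tsx
// Hypothetical page template for AI-led prototyping.
// The "locked" imports are pre-coded design-system components the
// AI is instructed never to modify; only the marked region is open.
import { AppHeader, SideNav, PageFooter } from "./locked/design-system";

export function PrototypePage() {
  return (
    <div className="page">
      {/* LOCKED: pre-coded elements, off-limits to the AI */}
      <AppHeader />
      <SideNav />

      {/* EDITABLE: the AI generates only what goes inside this region */}
      <main id="ai-workspace">
        {/* ...AI-generated content... */}
      </main>

      {/* LOCKED */}
      <PageFooter />
    </div>
  );
}
```

Constraining the AI to one clearly marked region keeps navigation, branding, and other core elements stable from one generation to the next.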
Google is testing the ability to add color to your search results page with a color palette picker. There is a palette icon at the top of the results that, when clicked, says "Add a splash of color to the top of Search" and "Pick color." Clicking "Pick color" loads 10 colors to "Choose an accent color for the top of Search."
To be honest, for many years, I was mostly reacting. Life was happening to me, rather than me shaping the life I was living. I was making progress reactively, and I was looking out for all kinds of opportunities. It was easy and quite straightforward - I was floating and jumping between projects and calls, making things work as I went along.
I would like to know why Adobe took all the user design interface away. This new version is VERY clunky, doesn't allow for manipulation of the elements, and really brings down my ability to create a quality product. PowerPoint has better functionality than this.
The other day I was browsing YouTube - as one does - and I clicked a link in the video description to a book. I was then subjected to a man-in-the-middle attack, where YouTube put themselves in the middle of me and the link I had clicked: Hyperlinks are subversive. Big Tech must protect themselves and their interests.
AI design tools are everywhere right now. But here's the question every designer is asking: Do they actually solve real UI problems - or just generate pretty mockups? To find out, I ran a simple experiment with one rule: no cherry-picking, no reruns - just raw, first-attempt results. I fed 10 common UI design prompts - from accessibility and error handling to minimalist layouts - into 5 different AI tools. The goal? To see which AI came closest to solving real design challenges, unfiltered.
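For a sense of scale, the whole experiment reduces to a single pass over a prompts × tools matrix. A rough sketch of that rule (my own illustration, not the author's actual harness; runTool stands in for whatever adapter calls each tool):

```ts
// Hypothetical harness: each (tool, prompt) pair is run exactly once,
// and the first response is recorded -- no reruns, no cherry-picking.
type Result = { tool: string; prompt: string; output: string };

async function runExperiment(
  tools: string[],
  prompts: string[],
  runTool: (tool: string, prompt: string) => Promise<string>,
): Promise<Result[]> {
  const results: Result[] = [];
  for (const tool of tools) {
    for (const prompt of prompts) {
      const output = await runTool(tool, prompt); // first attempt only
      results.push({ tool, prompt, output });
    }
  }
  return results; // 5 tools x 10 prompts = 50 raw results
}
```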
Most design problems aren't 'design' problems. They're 'Thinking' problems. They're 'Clarity' problems. They're 'Too-many-tabs-open' problems. More prototyping. More pixel-shifting. More polish in Figma alone isn't going to help you with those. For me, without clear thinking, Figma just results in more confusion, more mess, and more mockups than I can mentally manage.

The Problem: Figma wasn't the bottleneck - my thinking was
WCAG does not normatively state that focus must be trapped within a dialog. Rather, the normative WCAG spec makes zero mention of requirements for focus behavior in a dialog. The informative 2.4.3 Focus Order understanding doc does talk about limiting focus behavior within a dialog - but again, this is in the context of a scripted custom dialog and was written long before inert or <dialog> were widely available.
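To illustrate the point about the native element (a minimal sketch of my own, not from the WCAG docs; the element IDs are hypothetical): with <dialog> and showModal(), the browser itself keeps focus out of the background, so a scripted focus trap isn't needed.

```ts
// Minimal sketch: showModal() renders everything outside the native
// <dialog> inert, so Tab cannot move focus behind the dialog and
// Esc closes it by default -- no hand-rolled focus trap required.
const dialog = document.querySelector<HTMLDialogElement>("#confirm")!;
const openButton = document.querySelector<HTMLButtonElement>("#open")!;

openButton.addEventListener("click", () => {
  dialog.showModal(); // background becomes inert while the dialog is open
});

dialog.addEventListener("close", () => {
  // When a modal dialog closes, browsers return focus to the element
  // that was focused before it opened.
  console.log("dialog closed with:", dialog.returnValue);
});
```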
A comprehensive UX audit of the whole app was the first thing I needed to do - to identify friction and suggest improvements. After reviewing the app and particular features, I had a list of flaws and possible improvements. And I had an idea - what if I screenshot the app and send it to Figma Make with instructions based on my findings? The result truly impressed me and, naturally, I became excited to show it to my client.
My role was straightforward: write queries (prompts and tasks) that would train AI agents to engage meaningfully with users. But as a UXer, one question immediately stood out - who are these users? Without a clear understanding of who the agent is interacting with, it's nearly impossible to create realistic queries that reflect how people engage with an agent. That's when I discovered a glitch in the task flow. There were no defined user archetypes guiding the query creation process. Team members were essentially reverse-engineering the work: you think of a task, write a query to help the agent execute it, and cross your fingers that it aligns with the needs of a hypothetical "ideal" user - one who might not even exist.
Service blueprints, information architecture diagrams, and funnels are foundational tools in product design, but they're often time-consuming to create and even harder to keep up to date. But what if we hire AI to do the job - and not just a random AI tool, but... Figma Make? I've already demonstrated how you can use Figma Make for UI design exploration and quick prototyping, but this tool also allows you to generate structured design artefacts directly from prompts, turning abstract thinking into tangible visuals within minutes.