UX design
From blog.logrocket.com: Which UI libraries/frameworks support the Linear aesthetic? (LogRocket Blog)
Separate customers from users to improve research, prioritization, and messaging across B2C, B2B, and B2B2C products.
The question dropped into the Slack channel before the user research summary. Before the problem was clearly defined. Before anyone asked if users actually needed this feature. Your product manager had already generated three interface options in ChatGPT. Now they're asking which one to build. Not whether to build. Not why to build. Which. And when you slow the conversation down to ask those questions, you discover that strategic thinking now reads as bottleneck behavior.
Using a pre-built template strategy: The Atlassian team realized that AI was often breaking core elements and not fully understanding complex commands. So they created a kind of design system for their AI-led prototyping: they feed the tool a page of pre-coded elements that the AI is not allowed to change, while leaving the remaining elements open to interpretation.
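The locking idea above can be sketched in code. This is a hypothetical illustration, not Atlassian's actual implementation: the slot names, types, and merge function are all assumptions, but they show the core rule that pre-coded slots are never overwritten by AI output.

```typescript
// Hypothetical sketch of a "locked template" prototyping strategy.
// Slot names and the merge API are illustrative assumptions.

type Slot = { locked: boolean; html: string };
type Template = Record<string, Slot>;

// Merge AI-generated fragments into the template, but only into
// slots explicitly marked as open to interpretation.
function applyAiOutput(
  template: Template,
  aiOutput: Record<string, string>,
): Template {
  const result: Template = {};
  for (const [name, slot] of Object.entries(template)) {
    const generated = aiOutput[name];
    result[name] =
      slot.locked || generated === undefined
        ? slot // pre-coded element: never changed by the AI
        : { ...slot, html: generated }; // open element: AI may rewrite it
  }
  return result;
}

const page: Template = {
  header: { locked: true, html: "<nav>pre-coded nav</nav>" },
  hero: { locked: false, html: "<section>placeholder</section>" },
};

const merged = applyAiOutput(page, {
  header: "<nav>AI tried to change this</nav>", // ignored: header is locked
  hero: "<section>AI-generated hero</section>", // applied: hero is open
});
```

The design choice here is that the guardrail lives in the merge step, not in the prompt, so a misbehaving model cannot damage the core elements no matter what it generates.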
Google is testing the ability to add color to your search results page with a color palette picker. There is this palette icon at the top of the results that when clicked on says "Add a splash of color to the top of Search," and "Pick color." When you click on pick color it loads 10 colors to "Choose an accent color for the top of Search."
To be honest, for many years, I was mostly reacting. Life was happening to me, rather than me shaping the life that I was living. I was making progress reactively and I was looking out for all kinds of opportunities. It was easy and quite straightforward - I was floating and jumping between projects and calls and making things work as I was going along.
I would like to know why Adobe took all the user design interface away. This new version is VERY clunky, doesn't allow for manipulation of the elements, and really brings down my ability to create a quality product. PowerPoint has better functionality than this.
The other day I was browsing YouTube - as one does - and I clicked a link in the video description to a book. I was then subjected to a man-in-the-middle attack, where YouTube put themselves in the middle of me and the link I had clicked: Hyperlinks are subversive. Big Tech must protect themselves and their interests.
During my eight years working in agile product development, I have watched sprints move quickly while real understanding of user problems lagged. Backlogs fill with paraphrased feedback. Interview notes sit in shared folders collecting dust. Teams make decisions based on partial memories of what users actually said. Even when the code is clean, those habits slow delivery and make it harder to build software that genuinely helps people.
WCAG does not normatively state that focus must be trapped within a dialog. Rather, the normative WCAG spec makes no mention of requirements for focus behavior in a dialog. The informative 2.4.3 Focus Order understanding doc does talk about limiting focus within a dialog, but again, that guidance is in the context of a scripted custom dialog and was written long before inert or <dialog> were widely available.
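For comparison, here is a minimal sketch of the native approach the passage alludes to. With `<dialog>` and `showModal()`, the browser moves focus into the dialog and makes the rest of the page inert, without any hand-rolled focus-trap script; the markup below is an illustrative fragment, not a complete page.

```html
<dialog id="confirm">
  <p>Discard unsaved changes?</p>
  <button id="discard">Discard</button>
  <button id="keep" autofocus>Keep editing</button>
</dialog>

<script>
  // showModal() (unlike show()) renders the dialog on the top layer,
  // makes content outside it inert, and moves focus into the dialog.
  document.getElementById("confirm").showModal();
</script>
```

Note that even with `showModal()`, Tab can still reach browser chrome, which is consistent with the point above: the normative spec never required strict trapping in the first place.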
A comprehensive UX audit of the whole app was the first thing I needed to do, to identify friction and suggest improvements. After reviewing the app and its key features, I had a list of flaws and possible improvements. Then I had an idea: what if I screenshotted the app and sent the screenshots to Figma Make with instructions based on my findings? The result truly impressed me and, naturally, I became excited to show it to my client.
My role was straightforward: write queries (prompts and tasks) that would train AI agents to engage meaningfully with users. But as a UXer, one question immediately stood out - who are these users? Without a clear understanding of who the agent is interacting with, it's nearly impossible to create realistic queries that reflect how people engage with an agent. That's when I discovered a glitch in the task flow. There were no defined user archetypes guiding the query creation process. Team members were essentially reverse-engineering the work: you think of a task, write a query to help the agent execute it, and cross your fingers that it aligns with the needs of a hypothetical "ideal" user - one who might not even exist.
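The fix the passage implies is to make the missing archetypes explicit so queries are grounded in defined users rather than guesswork. The sketch below is hypothetical; the archetype names, fields, and phrasing functions are illustrative assumptions, not the team's actual schema.

```typescript
// Hypothetical user archetypes to guide training-query creation.
// All names and fields here are illustrative assumptions.

interface Archetype {
  name: string;
  goals: string[];
  // How this kind of user would actually phrase a task to the agent.
  phrase: (task: string) => string;
}

const archetypes: Archetype[] = [
  {
    name: "time-pressed manager",
    goals: ["summaries", "delegation"],
    phrase: (task) => `Quick - ${task}, bullet points only.`,
  },
  {
    name: "cautious first-time user",
    goals: ["guidance", "reassurance"],
    phrase: (task) => `I'm not sure how this works. Can you help me ${task}?`,
  },
];

// Generate one query per archetype for a given task, so coverage
// reflects defined users instead of a hypothetical "ideal" one.
function queriesFor(task: string): string[] {
  return archetypes.map((a) => a.phrase(task));
}

const queries = queriesFor("draft a status update");
```

Even two rough archetypes like these would have inverted the reverse-engineered workflow the passage describes: the user comes first, and the query follows from how that user talks.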
"People talk most about": In several examples, this is just a straight-up list of exactly three menu items. For one spot, it was literally just: omakase, sake, nigiri. "People love to order": This section gets a star icon and goes deeper than just listing names. It includes brief descriptions of specific dishes, like noting a Michigan Roll has a "generous portion of tuna."
AI is disrupting more than the software industry, and is doing so at a breakneck speed. Not long ago, designers were deep in Figma variables and pixel-perfect mockups. Now, tools like v0, Lovable, and Cursor are enabling instant, vibe-based prototyping that makes old methods feel almost quaint. What's coming into sharper focus isn't fidelity, it's foresight. Part of the work of Product Design today is conceptual: sensing trends, building future-proof systems, and thinking years ahead.