13 Matching Annotations
  1. Last 7 days
    1. Walkthroughs (Polson, P. G., Lewis, C., Rieman, J., & Wharton, C. (1992). Cognitive walkthroughs: a method for theory-based evaluation of user interfaces. International Journal of Man-Machine Studies.) are methods where an expert (that would be you, novice designer), defines tasks, but rather than testing those tasks with real people, you walk through each step of the task and verify that a user would know to do the step, know how to do the step, would successfully do the step, and would understand the feedback the design provided. If you go through every step and check these four things, you’ll find all kinds of problems with a design.

      This step is key to designing a good workflow. In my prior work as a product designer, it helped me realize one of the biggest flaws in my design process: I jump into prototyping or working on visual elements before thoroughly thinking through the workflow itself. This often results in big gaps in the workflow, because the design is based on my own experience and understanding.

      For example, one of the workflows I designed was a feature to help users with cognitive disabilities identify a position of interest (e.g. cashier), but one of the input sources I used to determine the result was industry of interest, which doesn't make much sense now that I look back: you can't expect most job seekers with cognitive disabilities to know whether they'd prefer to work in retail, hospitality, etc.
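
      To make the walkthrough method above concrete, here is a minimal sketch of the four checks as a reusable checklist. The task, steps, and helper names are hypothetical illustrations, not anything from the reading:

      ```python
      # Hypothetical sketch: recording a cognitive walkthrough as data, so that
      # every step of a task is checked against the four questions from the quote.
      from dataclasses import dataclass, field

      QUESTIONS = (
          "Would the user know they need to do this step?",
          "Would the user know how to do this step?",
          "Would the user successfully do this step?",
          "Would the user understand the feedback the design provides?",
      )

      @dataclass
      class StepResult:
          description: str
          answers: dict = field(default_factory=dict)  # question -> (passed, note)

          def problems(self):
              return [(q, note) for q, (ok, note) in self.answers.items() if not ok]

      def walk_through(steps):
          """Prompt the evaluator through every step and every question."""
          results = []
          for description in steps:
              result = StepResult(description)
              for question in QUESTIONS:
                  answer = input(f"{description}\n  {question} (y/n): ").strip().lower()
                  note = "" if answer == "y" else input("  Describe the problem: ")
                  result.answers[question] = (answer == "y", note)
              results.append(result)
          return results

      if __name__ == "__main__":
          # Hypothetical task steps for the job-matching feature described above.
          steps = [
              "Open the job-matching screen",
              "Select a position of interest (e.g. cashier)",
              "Review the suggested matches",
          ]
          for step in walk_through(steps):
              for question, note in step.problems():
                  print(f"PROBLEM at '{step.description}': {question} -> {note}")
      ```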

    1. We’re here to test this system, not you, so anything that goes wrong is our fault, not yours.

      This is one of the most common problems I've encountered while working on projects: people are very self-conscious during usability tests. One thing that has worked for me is to remind users of this multiple times during the session (e.g. by embedding reminder text into the software we're testing, or by repeating it verbally).

  2. Oct 2025
    1. Conventions are design patterns (combinations of design decisions) that people have already learned

      This is highly dependent on what users are used to, and it reminds me of a story I heard about a mobile company designing a music streaming app for rural users in the US. The product manager chose an extremely simple layout (e.g. pre-built playlists on the home screen, two clicks to play music), which shows how important it is to factor in your users' technical literacy when designing a UI.

    1. As you can see, prototyping isn’t strictly about learning to make things, but also learning how to decide what prototype to make and what that prototype would teach you. These are judgements that are highly contextual because they depend on the time and resources you have and the tolerance for risk you have in whatever organization you’re in.

      Agreed, and this again reminds me that design thinking requires extensive research and evaluation. The pizza app example shows that you can't design a good product simply by looping prototype -> test -> improve. Somewhere within that process, you must stop to evaluate whether this is the right problem to solve, whether you even have the right solution, and how you can test those hypotheses.

    1. Research has shown that, compared with the better educated and better informed, less educated and less informed respondents have a greater tendency to agree with such statements. This is sometimes called an “acquiescence bias”

      Agreed. Another thought I had is that people frequently find it more difficult to challenge a statement than to agree with it. So it's also important to factor in that people can be less inclined to challenge a faulty or inaccurate assumption, whether because they have a hard time articulating their disagreement or because they've had less practice with the kind of critical thinking that encourages challenging an assumption or the status quo.

    1. Most any method and medium can work, as long as you can clearly see the comparison data points, share with your team & stakeholders, and make data-driven decisions for your design solution.

      Agreed. For the longest time I was searching for a "perfect framework" that would let me perform the most thorough competitive analysis. I then realized that the nature of the work is to show comparisons, and that the method should center on the stakeholders we're trying to communicate with.

    1. From a design justice perspective, this might mean arranging a critique session not with other designers, but with stakeholders, asking them to bring their lived experience and knowledge of their domain to critically analyzing your design.

      I strongly agree with this suggestion because it brings us back to the core principle of design: designing for the user. But this practice faces some of the same constraints as user interviews: people will not always give you honest answers, because of perceived barriers or a desire to come off a certain way and meet social expectations. One method I've found interesting is to gather a group of users/stakeholders with a facilitator who may in fact be the designer but is not disclosed as such, which removes the friction of participants softening their critiques out of fear of offending or hurting the designer.

    1. These are not big, challenging questions to ask, they’re just big, challenging questions to answer.

      I often find that the questions most worth solving are the most complex to answer, while the problems I'm most capable of solving are often very niche, specific usage scenarios (e.g. filtering emails into three separate buckets). So a big part of being creative is also knowing where to set your scope and boundaries: it's often more effective to solve part of a problem than to tackle the whole thing (e.g. Tesla built the entire EV infrastructure from production to charging and maintenance, but not every EV company needs to do that to offer an effective solution).
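
      As a toy illustration of solving just part of a problem, here is what the "three buckets" email example might look like with simple keyword rules (the bucket names and keywords are made up):

      ```python
      # Hypothetical sketch: route emails into three buckets with keyword rules,
      # deliberately solving a narrow slice of email triage rather than all of it.
      RULES = {
          "urgent": ("asap", "deadline", "overdue"),
          "newsletters": ("unsubscribe", "digest"),
      }

      def bucket(subject: str) -> str:
          """Return the first matching bucket, or the catch-all third bucket."""
          text = subject.lower()
          for name, keywords in RULES.items():
              if any(keyword in text for keyword in keywords):
                  return name
          return "everything else"

      print(bucket("Invoice overdue - please respond ASAP"))  # urgent
      print(bucket("Weekly digest: design reads"))            # newsletters
      print(bucket("Lunch on Friday?"))                       # everything else
      ```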

    2. Externalize often. The more you express those ideas—in words, in sketches, in prototypes, in demos—the more visible those flaws will be to you and other people.

      This is harder than it sounds because people often fear critique. One approach I've found helpful is to use AI to critique my ideas. There are caveats, though: lean on it too much and it starts brainstorming for you, and many of its suggestions can be generic or irrelevant. In my practice, it has been most helpful for surfacing aspects I had missed.

  3. Sep 2025
    1. Better, right? It shows the scale of the problem and it shows multiple consequences of the problem. It even adds a bit of context to the problem, talking about weeknights specifically and the types of food that Americans can’t enjoy.

      An interesting observation: as I read the progressively more specific problem statements, my brain gets more excited about the problem at hand, actively mapping it to potential solutions and trying to envision what I still don't see about it. The value of writing a detailed, data-backed problem scenario/statement is that it not only captures the issue more accurately, it also prompts us to dig deeper into the rest of the puzzle and encourages more solution brainstorming.

    1. Because everyone’s problems are personal and have different causes and consequences, there is no such thing as the “average user” (Trufelman, A. (2016). On average. 99% Invisible.). Every single solution will meet some people’s needs while failing to meet others.

      Agreed. I once looked into developing a small phone widget to quickly convert currencies. While exploring potential use cases, I quickly realized that it's impossible to account for everyone's considerations, even for such a simple feature. With the cost of developing apps dropping every day, I wonder whether many more tools (at least on the software side) will become far more customizable to user needs. Instead of logging into an app and being presented with a fixed list of options, the app would ask what you wish to accomplish and how you prefer to accomplish it, then provide a customized list of features. This isn't a perfect solution either, given the technical complexity and potential learning curve, but we might start to see a shift in this direction.
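
      A toy sketch of that kind of goal-driven customization, using the currency-widget idea above (the goal names and feature mapping are invented for illustration):

      ```python
      # Hypothetical sketch: instead of a fixed menu, the app asks for the user's
      # goals up front and assembles a feature list from whatever they selected.
      FEATURES_BY_GOAL = {
          "convert currency quickly": ["home-screen widget", "pinned currency pair"],
          "track exchange rates": ["rate alerts", "historical chart"],
          "split travel expenses": ["multi-currency ledger", "share with friends"],
      }

      def customized_menu(stated_goals):
          """Return a de-duplicated feature list for the goals the user selected."""
          menu = []
          for goal in stated_goals:
              for feature in FEATURES_BY_GOAL.get(goal, []):
                  if feature not in menu:
                      menu.append(feature)
          return menu

      print(customized_menu(["convert currency quickly", "track exchange rates"]))
      # -> ['home-screen widget', 'pinned currency pair', 'rate alerts', 'historical chart']
      ```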

    1. For example, consider the activity of driving a bus: it’s not just the driver that matters, but the dispatchers that communicate information to drivers, the other drivers on the road

      Relatable. When I was designing a course planning tool, I noticed during my design review that I hadn't really considered the needs of administrators or academic advisors, who are also crucial parts of the loop. You see this kind of problem in a lot of existing tools, where an individual function might be nicely designed but doesn't fit into the larger system. An example is Microsoft's Tay chatbot, which shipped without proper safeguards and was extensively exploited to produce harmful output.

    1. Seeking multiple perspectives on a problem (sometimes conflicting ones). There’s no better way to understand what’s actually happening in the world than to view it from as many other perspectives as you can.

      The comment about seeking multiple perspectives resonates with me deeply. When you envision a solution to a problem, it is easy to dive too deep into a single narrative and ignore the more holistic view of the issue. On the projects I've worked on, there were plenty of instances where I focused my energy on designing the wrong features simply because I failed to account for other perspectives. Had I talked to users more, I would've realized that some problems are simply annoyances with no drastic impact on the actual user experience.