Am I An Asshole App

Lately, I’ve been full of ideas but empty on follow-through. To break that cycle, I’m going to write one down.

I present to you an app called “Am I An Asshole?”

This wouldn’t be the final name, but the concept stands. This app answers one question: How am I acting right now? It analyzes your communication to tell you if you’re being rude, condescending, inconsiderate, or otherwise an asshole.

Some people clearly need this more than others, but we all have our moments. Whether you're talking with family, navigating a meeting with coworkers, or chatting with a stranger, there’s always an opportunity to cause unintended offense.

The value lies in real-time feedback. Reflection on past events can be useful, but we often forget how we felt at the time. Catching a mistake as it happens allows for immediate course correction and a better understanding of the triggers that lead to poor behavior. By surfacing these signals early, the app gives you a chance to pause, recalibrate, and respond more thoughtfully before a conversation spirals.

Beyond simple “asshole detection,” the engine analyzes a spectrum of behaviors, including defensiveness, snark, sarcasm, and aggression. Therapists could integrate this tool into behavioral programs such as anger management, notifying users the moment their speech patterns shift toward hostility.

Conversely, the app provides positive reinforcement when it detects supportive, empathetic, or constructive communication. Over time, this feedback loop could help users develop better conversational habits.

Think of it as self-awareness as a service. A digital angel on your shoulder.

Design

Ideally, this app integrates seamlessly into your daily life, helping without getting in your way. It should alert you to behavioral shifts with enough subtlety that you can adjust your tone without breaking the flow of a conversation.

Inputs

The app works by processing environmental inputs like audio and text through an emotional analysis engine. To be effective, the system would need to monitor all your speech and digital communications.
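
As a rough sketch of what that pipeline might look like (every name here is hypothetical, not a real API), each input channel would feed a common analysis step that returns per-trait scores for the rest of the app to act on:

```python
from dataclasses import dataclass

@dataclass
class Utterance:
    """A single unit of communication, from speech-to-text or a typed message."""
    source: str        # e.g. "microphone", "slack", "sms"
    text: str
    timestamp: float   # seconds since epoch

def analyze(utterance: Utterance) -> dict[str, float]:
    """Hypothetical emotional analysis engine.

    Returns a score in [0, 1] for each tracked trait. A real implementation
    would call an acoustic or text model; this stub only illustrates the shape
    of the output the rest of the app consumes.
    """
    traits = ["hostility", "condescension", "sarcasm", "empathy"]
    return {trait: 0.0 for trait in traits}  # placeholder scores
```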

If you think this sounds like a privacy nightmare, you're absolutely right. This level of data capture represents a surveillance capability that would be the envy of any authoritarian regime. I’ll address these concerns in the section below on challenges.

Real-Time Notifications

When your behavior crosses a specific threshold, the app triggers a notification.

This notification could come in a few different forms. If you’re having a conversation in person, the ideal form factor might be smart glasses or an augmented reality device. This would allow for the most seamless usage and discreet feedback. A subtle hint in your peripheral vision would warn you that your tone is shifting. A haptic notification on a smartwatch could work as a fallback, though it would be more likely to pull you out of the conversation.
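
To make the mechanics concrete, here is a minimal sketch of the threshold check and channel routing, assuming per-trait scores in [0, 1] like those in the earlier sketch (the thresholds and channel names are made up for illustration):

```python
# Hypothetical per-trait thresholds; tuning these well is most of the work.
THRESHOLDS = {"hostility": 0.7, "condescension": 0.6, "sarcasm": 0.8}

def route_notification(scores: dict[str, float], context: str) -> list[str]:
    """Return alert messages for any trait that crossed its threshold.

    `context` would decide the delivery channel: a peripheral AR hint in
    person, a banner while drafting a message, a toast during a video call.
    """
    alerts = []
    for trait, threshold in THRESHOLDS.items():
        if scores.get(trait, 0.0) >= threshold:
            alerts.append(f"[{context}] Your tone is reading as {trait}.")
    return alerts

# Example: a Slack draft that scored high on condescension.
print(route_notification({"condescension": 0.72}, context="slack"))
```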

If you’re sending a message to someone, the app could notify you that you’re behaving in an undesirable manner. Similarly, if you’re on a video call for work, the app would notify you that you aren’t listening properly or that you just dismissed your coworker’s idea.

Heads Up Display for Continuous Monitoring

Real-time notifications are useful in the moment, but sometimes you want a more detailed view of your emotional state. The app could also display a live visualization of your overall disposition and the intensity of specific traits.
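
As a sketch, that display could be driven by a rolling “disposition snapshot”, a simple moving average of the most recent trait scores (hypothetical names throughout):

```python
from collections import deque
from statistics import mean

class DispositionTracker:
    """Keeps a short rolling window of trait scores for the heads-up display."""

    def __init__(self, window: int = 20):
        self.window = window
        self.history: dict[str, deque] = {}

    def update(self, scores: dict[str, float]) -> None:
        """Record the latest per-trait scores from the analysis engine."""
        for trait, value in scores.items():
            self.history.setdefault(trait, deque(maxlen=self.window)).append(value)

    def snapshot(self) -> dict[str, float]:
        """Current intensity of each trait, smoothed over the window."""
        return {trait: round(mean(values), 2) for trait, values in self.history.items()}
```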

Analyzing Others' Emotional States

Another feature worth exploring is the analysis of other people in a conversation. If you’re talking with someone who is suddenly agitated, angry, or annoyed, you can adjust your approach or offer support.

Historic Behavioral Analytics and Patterns

Over time, the app can aggregate data to identify long-term patterns. You might be more irritable on Monday mornings or more relaxed when working from a specific location. Maybe you’re more patient or empathetic when speaking with a particular person. The app can display your history and highlight interesting patterns over time, helping you understand the triggers and contexts that affect your behavior.
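
Here is a sketch of the kind of aggregation this implies, assuming scored utterances are logged with timestamps (pandas is used purely for illustration):

```python
import pandas as pd

def irritability_by_weekday(log: pd.DataFrame) -> pd.Series:
    """Average hostility score per weekday from a log of scored utterances.

    Expects columns: 'timestamp' (datetime64) and 'hostility' (float in [0, 1]).
    """
    log = log.copy()
    log["weekday"] = log["timestamp"].dt.day_name()
    return log.groupby("weekday")["hostility"].mean().sort_values(ascending=False)
```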

Configuration

You should be able to configure the app to assist you only in specific contexts. For example, you can enable analysis in particular apps like Slack, text messaging, or Zoom. You could also enable or disable the analysis of other people in order to respect their privacy.
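
That configuration could be a small declarative settings object; a hypothetical sketch:

```python
from dataclasses import dataclass, field

@dataclass
class AppConfig:
    """Per-user settings controlling where and how analysis runs."""
    enabled_sources: set = field(default_factory=lambda: {"slack", "sms", "zoom"})
    analyze_others: bool = False      # off by default to respect others' privacy
    quiet_hours: tuple = (22, 7)      # no notifications between 10pm and 7am

def should_analyze(config: AppConfig, source: str, speaker_is_self: bool) -> bool:
    """Gate the analysis pipeline based on the user's configuration."""
    if source not in config.enabled_sources:
        return False
    return speaker_is_self or config.analyze_others
```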

Ethical & Technical Challenges

The path to “self-awareness as a service” is blocked by significant ethical and technical hurdles. To move from concept to reality, the system must balance utility against risk.

Privacy

The app’s effectiveness depends on total environmental awareness, which creates an inherent privacy problem. To function as a true real-time mirror, the engine requires access to every word spoken and every message typed. Even with a local-first architecture where data never leaves the device, the sheer value of the information makes it a high-impact target for security exploits. Privacy and security would not just be features; they would be fundamental prerequisites for the product’s existence.

Accuracy

Human interaction is governed by nuance, subtext, and culture. Identifying the thin line between harmful aggression and playful sarcasm remains a significant challenge. False positives could undermine trust in the product, especially in emotionally sensitive moments. 

Thankfully, the app could leverage established tools such as Hume AI and openSMILE to build on a foundation of existing research rather than reinvent the wheel.
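
For example, the openSMILE Python bindings can extract standard acoustic features from a clip of speech, which could then feed a custom trait classifier. The classifier is the part that does not exist yet, so treat this as a sketch of the integration rather than a tested pipeline:

```python
import opensmile

# eGeMAPS is a standard acoustic feature set used widely in emotion research.
smile = opensmile.Smile(
    feature_set=opensmile.FeatureSet.eGeMAPSv02,
    feature_level=opensmile.FeatureLevel.Functionals,
)

features = smile.process_file("utterance.wav")  # returns a pandas DataFrame
# The features would then go to a (hypothetical) trait classifier:
# scores = trait_classifier.predict(features)
```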

Authenticity Suppression

If you're constantly monitoring and adjusting your behavior based on algorithmic feedback, are you becoming a better person? Or just a better performer? Genuine emotional growth typically involves understanding why you feel and react certain ways, not just suppressing surface-level expressions.

An over-reliance on external feedback could short-circuit the deeper introspective work that leads to lasting change. Users might learn to game the algorithm rather than genuinely improve their emotional intelligence.

There's also the question of what it does to human interaction when we know we're being constantly evaluated. Some of our most important conversations involve expressing difficult emotions. A tool that's always nudging you toward “pleasant” behavior might suppress legitimate emotional expression and authentic communication.

The goal should be helping people communicate better, not just nicer. Sometimes being direct, even at the risk of seeming like an asshole, is the most ethical choice.

The Self-Awareness Catch-22

The people who need this most may be the least likely to admit they have a problem. Marketing an app as “Am I An Asshole?” appeals to the self-deprecating and the self-improving, but the true assholes may require an external incentive. If you were to truly develop this app, you would have to find a more marketable story.

Form Factor

Continuous behavioral analysis requires an uninterrupted stream of communication. The success of this product is linked to the mass adoption of ambient hardware such as smart glasses or AI wearables. However, an initial desktop-first deployment could target professional ecosystems like Slack, Zoom, and Teams.

Operational Costs

Running the analysis locally may put a strain on device batteries, and running in the cloud may be prohibitively costly at scale. Cost management would be critical for anyone seriously exploring the idea.

Conclusion

You may think this idea is absurd, invasive, or even dystopian. That reaction is completely understandable. This is essentially a surveillance state masquerading as a consumer application.

However, our society suffers from rampant miscommunication. We often hurt others without intending to, simply because we cannot see ourselves clearly in the heat of the moment. What if better self-awareness tools could genuinely reduce everyday friction, resentment, and miscommunication? Even marginal improvements at scale could meaningfully improve social interactions.

Call it crazy, but think about all the assholes in your life. Don't you wish they had this app?
