Emotion recognition systems are finding growing use, from monitoring customer responses to ads to scanning for ‘distressed’ women in danger.
Too bad it gets the emotion and not the context.
I’d love to be fired because “I hate making money for these greedy ass capitalist douchebags” pops up on a screen whenever I come in.
The idea that employers should even be allowed to ask what their employees are feeling, much less scan them to discern it, is a new low for our modern Orwellian dystopia.
The thing is though, I don’t see how something like this could even work out.
Like, you hire employee 1, they get frustrated at something overnight. You fire them for being upset. Now you have to fill the seat. Employee 2 is brought on. They get told what happened to the person they replaced. They leave or are fired for having emotions and being human. This repeats ad nauseam.
Let’s be real, most of us would get weeded out at the interview when they start spilling all the “we’re like a family” bullshit.
I’m guessing it’s going to be implemented as identifying “persistent negative attitudes” and as validation to fire anyone in non-fire-at-will locales.
It could also be used as bullshit to deny raises and promotions if your “grateful” or “motivated” indexes weren’t high enough.
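For what it’s worth, the kind of “persistent negative attitude” flag speculated above could be as crude as a ratio over a window of scans. This is a made-up sketch, not any vendor’s real system; the emotion labels, threshold, and function names are all assumptions for illustration.

```python
from collections import Counter

# Hypothetical labels a classifier might emit; the set and threshold
# are invented for this sketch, not taken from any real product.
NEGATIVE = {"angry", "sad", "frustrated", "disgusted"}

def flag_employee(daily_scans, threshold=0.4):
    """Flag an employee if too many scans were labeled negative.

    daily_scans: list of emotion labels, one per scan.
    Returns True when the negative fraction exceeds the threshold.
    """
    if not daily_scans:
        return False
    counts = Counter(daily_scans)
    negative = sum(counts[label] for label in NEGATIVE)
    return negative / len(daily_scans) > threshold

# One bad afternoon is enough to trip the flag:
print(flag_employee(["happy", "sad", "angry", "frustrated", "neutral"]))  # True
```

Which is exactly the problem: a threshold like this can’t tell a “persistent attitude” from a single rough day, or from context it never sees.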
so, basically a tool to suss out which employees have undisclosed mental health issues that the employer can’t legally ask about. cool. cool.
It’s about time we start holding the engineers building these technologies responsible directly.
I’m not talking about scientists expanding knowledge, I’m talking specifically about the engineers building these technologies.
Is mood recognition a tool useful for anything other than maintaining power over others (actually curious)?
Seriously. Why are people choosing to work for these companies? There are other ways to make a buck. Have some fucking morals.
The threat of homelessness and starvation is quite coercive. That’s why people still work at these kinds of jobs.
At a certain point, not just the companies doing this are to blame but the people working for them as well. Who tf can support this kind of thing? People need to have some self fucking respect.
For example, we could probably have a cure for cancer by now if half the effort that went into making unbeatable THC drug tests had gone into it. It’s clear where society’s priorities are. Improving lives does not generate profit.
You can to an extent, but that’s a losing venture. If public opinion goes against this tech hard enough, it’ll keep some people from working in those industries. BUT if those products are profitable enough, they will simply pay more and that’ll be moot.
Attacking the people who are earning a living isn’t the answer. Most people take the job with the best combo of pay and work/life balance they can find in their area, or wherever they can afford to move. Not that many have the luxury to pick and choose based on their morality. And if compensation is high enough, it’s a lot less likely anyone will.
It’s far easier to try to prevent this tech from being used at all. I know political action is hard as hell but it’s a lot easier than trying to ostracize an entire industry’s worth of workers. It may feel easier to denigrate faceless individuals but that won’t accomplish anything. Plenty of people work for weapons manufacturers and such.
Plenty of people work for weapons manufacturers and such.
And those are bad people. If you work to build technology used to maintain power when you have an option not to, what else can that be called? These people are not desperate for a job.
I’m an engineer, I quit Intel (after the startup I worked for was acquired) because Intel powers much of the MI complex. I quit Illumina when it became clear I was directly assisting with state level genetic experiments. As an engineer I could easily get a job elsewhere where I was not directly contributing to the downfall of my fellow humans.
Take McDonald’s for example. There’s a difference between someone who needs a job working in a restaurant and an engineer working for McDonald’s figuring out how to more efficiently slaughter animals, paid only to be concerned about their employer’s profit – that engineer could go work to more efficiently bake cookies.
These people are not desperate for a job.
You’re painting with a firehose. Some people are.
I’m an engineer, I quit Intel (after the startup I worked for was acquired) because Intel powers much of the MI complex. I quit Illumina when it became clear I was directly assisting with state level genetic experiments. As an engineer I could easily get a job elsewhere where I was not directly contributing to the downfall of my fellow humans.
You are what we call privileged. Maybe you should…check it?
There are uses for it. They can track the average mood of an entire room over a period of time. If you use that somewhere like a restaurant, or a banquet venue, then that information can be useful for tweaking the policies, environment, prices, etc. Of course an actual human could do this too, just by being there. I think it’ll get the most use at places like casinos where they’re always using psychological tricks to make people want to gamble. Ironically I don’t think that “happy” is the mood they’ll be aiming for.
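The “average mood of an entire room over a period of time” idea boils down to aggregating per-face scores per frame. A minimal sketch, assuming each frame yields per-face valence scores in [-1, 1] (negative = unhappy); the scoring scale and data are invented for illustration:

```python
from statistics import mean

def room_mood_timeline(frames):
    """Average the per-face valence in each frame into one room-level score.

    frames: list of frames, each a list of per-face valence scores.
    Empty frames (nobody detected) are skipped.
    """
    return [round(mean(faces), 2) for faces in frames if faces]

# Made-up readings from three moments in a venue:
frames = [
    [0.6, 0.2, -0.1],    # mixed crowd
    [0.5, 0.4],          # mostly content
    [-0.7, -0.4, -0.6],  # something upset the room
]
print(room_mood_timeline(frames))  # [0.23, 0.45, -0.57]
```

Note this room-level aggregate is the least invasive version; the systems described elsewhere in this thread track individuals with persistent IDs, which is a very different thing.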
Ya, I guess I can see some uses for it, but nothing that makes the risks of its existence worth it.
It seems like every tool/tech will be used by good people to do good things and bad people to do bad things. Some things, like a spoon, are handy for getting good things done but not very useful for bad people to do bad things with. Other tools, like mood recognition, might be quite handy for bad people looking to control others, but only moderately useful to good people.
I think we should be wary of letting tools in that second group exist. Just because something can be done doesn’t mean it should be done or that it can be called “progress”.
It has already existed for a decade or so. I’m surprised it hasn’t made headlines before. I saw a working demo of it at the Microsoft Visitor Center about 8 years ago. In addition to estimating your mood, it also assigns you a persistent ID, estimates your height, weight, eye color, hair color, ethnicity, and age. It is scarily accurate at all of those things. That ID can be shared across all linked systems at any number of locations. I completely agree with you that there are a lot of concerning, if not downright terrifying implications of this system. It’s a privacy nightmare.
Is mood recognition a tool useful for anything other than maintaining power over others (actually curious)?
If you ever want a real General AI then it will need the ability to recognize the mood of the person it’s interacting with. ESPECIALLY if you want to use it for things like Mental Health Counseling.
Mental Health Counseling.
Thanks, that’s the kind of valid answer I was looking for. Though we don’t have actual AI and probably won’t have actual AGI for at least a good decade (we currently have machine learning and complex decision trees, which appear kinda intelligent to us in 2023).
This needs to be shut down. It’s the most dystopian thing I have ever read.
What’s crazy is that this was already fully functional and in-use at least 8 years ago. Idk how this has stayed out of the headlines until now. Microsoft had a working demo of this in their visitor center in 2015 and was already using it in multiple places. As soon as you enter the room it assigns you a persistent ID, estimates your height, weight, eye color, hair color, and age. Then it tracks your mood and the overall mood of the room continuously. The ID can be persistent across any number of linked locations. They don’t ask for anyone’s permission before using it.
I DON’T NEED TO BE/ACT HAPPY TO GET MY JOB DONE
I hope they keep a lot of drive space for the depression folder.
Sounds like you are fighting on behalf of the whole world. I hope you get some time with yourself, or with a smaller circle that is positive, and a break from the dumpster fires of modern civilization.
If they could do that, they would probably see how goddamn miserable most people are. If they used that to change things and make people less miserable, I wouldn’t see it as dangerous. But more than likely it will be more “your sadness doesn’t vibe with us. You’re fired.”
Everyday we get closer to a cyberpunk dystopia.