Mar 26, 2018
790
For some reason I had a dream about this. It was Sony, and they had an actual game to show off the helmet: Metal Gear with giant, human-like robot enemies. Then they had a live demo station and I wore the helmet. Nothing like VR, more like a Starship Troopers helmet with a flat, see-through front showing red readouts and information. Enhanced audio. I have no idea, but don't read ResetEra while sleeping. 90 dollars, and they couldn't make it before. Too large and expensive... Will leave this here in case any of this ever happens. Doubtful
 

Tobor

Member
Oct 25, 2017
29,138
Richmond, VA
The cynic in me thinks this is devious marketing designed to build consumer interest in HoloLens. There's a whole market of idiots that will want this thing just because the military is using it.

Probably not, but you never know these days.
 

Biggersmaller

Banned
Oct 27, 2017
4,966
Minneapolis
Not that I disagree with you, but what OS do you think the US military has used for decades? Hint: I doubt it's OS X.

The military using Microsoft's near-ubiquitous OS is different from the Pentagon straight-up cutting a $500 million check to weaponize HoloLens.

Regardless, this is about supporting employees who protest their project being weaponized. The sentiment that their complaints are invalid because they "should've seen it coming" is an odd takeaway when these people are simply taking reasonable action to follow their conscience.
 

SpinierBlakeD

Attempted to circumvent ban with an alt account
Banned
Oct 28, 2018
1,353
Not necessarily, no. I could be pedantic and describe how, in the general sense, "increased lethality" does not strictly mean "kill them before they kill you"; or be less pedantic and describe scenarios like the adoption of the 5.56mm round increasing lethality without appreciably increasing troop safety; or I could be even less pedantic and point out that eliminating the enemy can entail multiple things (which is why the word "casualty" even exists), and making soldiers more lethal is just one kinda messed up way to do that.

Can I also just point out that the reasoning "if our troops can eliminate the enemy before the enemy can eliminate them, they'll be safer" is both a shoddy premise and a bit of a fallacy? It assumes that, given these augmented capabilities and the alleged force multiplication behind a targeting system, soldiers will be sent on assignments of similar risk level to those of today. There's no guarantee that's the case. It also assumes that soldiers will be safer because the primary cause of death for soldiers is direct confrontation with enemy fire. Given the state of Interceptor armor these days, that's not really the case; IIRC most deaths are due to shrapnel from explosives. Given that their own briefing talks about detection but the best they can promise is more lethality, I'm not hopeful for soldiers being able to, say, have facial detection at checkpoints for suspicious persons or better tracking of noncombatants. Even if we wanted to assume those benefits, we'd be going beyond what the military itself is promising.

Here's a counter-intuitive thought: making soldiers better at killing might be unhealthy for them mentally. Desensitizing soldiers to killing might be unhealthy for them too, and since drone pilots can get PTSD from what they do, it's hard to claim that long-term mental health benefits are the focus. The gamification of war is also a valid ethical concern and may harm soldier safety in the long term. If you want our troops to be safe, you want them to be used sparingly, and you want the participants in war, at home and abroad, to understand the severity of what they're committing to. Making war more like a game runs counter to these goals, and, I can't stress this enough, it might not even help with people's mental health. It's still a "game" where your friends can die messily and you'll be shoved into stressful situations for hours.

I can see why the people who designed this would be horrified. With inventions like computers and operating systems, there's no real need to assume your operating system will help cause loss of life; I guarantee you actual computerized weapons of war aren't running Windows Vista. There's a genuine difference between a logistical tool and a device that will actively facilitate killing without the prior knowledge of its inventors.
Wow, great post. You made some awesome points that really change the way I see it. I definitely agree with you on the gamification of combat. I went to Marine Corps Officer Candidate School two years ago, and a major part of the school was desensitizing you to war. You'd have to scream "kill" before taking pretty much any action, and you carried your rifle around with you everywhere you went. Instructors were always talking about their desire to "hook and jab" with the enemy. When we did our MCMAP (Marine Corps Martial Arts Program) training we had to scream "kill" and we practiced bayonet strikes. Then we did pugil stick fights with each other in front of crowds of cheering people.

And I get it, on paper. War is a messy business. People die, and as a cog in the machine you really have no say in the matter. Aside from WWII, wars are often fought for super shady reasons. Whatever your political allegiance, you have to serve at the discretion of the Hill. It was something that was talked about a lot at OCS. You don't serve the President, and you don't serve Congress. You serve your country. And a lot of times your country sends you into some pretty iffy ethical battles. Demonizing the enemy is an easy way for soldiers to cope with what they're doing, so trivializing killing makes it easier.

The problem with this is that it also attracts certain undesirables to the cause. The military preaches honor and valor. They say there's no greater deed than serving your country. You want to believe everyone is like Captain America. But in reality there are a lot of scummy people who just want to kill people. That's not to say everyone is like that, or even the majority of people. But they exist, and it's an unfortunate byproduct of the aforementioned training technique. Apologies if I've rambled; your post made me have feelings. Lol
 

Justsomeguy

Member
Oct 27, 2017
1,721
UK
Is there any part of HoloLens that was designed specifically for defence applications? Honest question.
None as far as I know (for v1; no insight into v2). Which is the problem, really... It's not like MS is a firm that generally builds things for the military, so a normal dev there would never assume that's where their tech would end up. (Source: worked there for a decade until recently.)
 

hibikase

User requested ban
Banned
Oct 26, 2017
6,820
I wish those guys the best, but I'm not too optimistic. For me, this situation would be cause enough to resign, but not everyone has that luxury.

I've read some of this thread, and it's pretty sickening how many people here don't understand the difference between making general-purpose software that the military happens to use and making software for specific military purposes. Not that this surprises me, though.
 

Deleted member 283

User requested account closure
Banned
Oct 25, 2017
3,288
The military has used video game-style training systems since around the 1960s. This is nothing new. The main purpose is to desensitize soldiers to killing the enemy in combat. Shoot/don't-shoot simulations are a common thing in the military and in policing.
I mean, if that doesn't say how disturbing war is, I don't know what does. You say desensitize, but that's just another way to say "stop seeing each other as human beings so you can kill each other better," which is disturbing enough in its own right, and even more so once you realize that desensitization training is unlikely to just vanish once the war ends. Of course it's easy to reply with something to the effect of "that's war," but that doesn't make what's involved any less disturbing, or make it any less understandable that certain individuals would be opposed to it on moral and ethical grounds.

I can definitely understand why these developers wouldn't exactly want to be a part of that, especially when the product they designed was meant for civilian, peaceful use, nothing close to this, and is being retrofitted for such a use without their knowledge or approval. To have their hard work go to such a purpose... yeah, it's only natural that some people would be upset and concerned, and I can't blame them at all.
 

Deffers

Banned
Mar 4, 2018
2,402
Wow, great post. You made some awesome points that really change the way I see it. I definitely agree with you on the gamification of combat. I went to Marine Corps Officer Candidate School two years ago, and a major part of the school was desensitizing you to war. You'd have to scream "kill" before taking pretty much any action, and you carried your rifle around with you everywhere you went. Instructors were always talking about their desire to "hook and jab" with the enemy. When we did our MCMAP (Marine Corps Martial Arts Program) training we had to scream "kill" and we practiced bayonet strikes. Then we did pugil stick fights with each other in front of crowds of cheering people.

And I get it, on paper. War is a messy business. People die, and as a cog in the machine you really have no say in the matter. Aside from WWII, wars are often fought for super shady reasons. Whatever your political allegiance, you have to serve at the discretion of the Hill. It was something that was talked about a lot at OCS. You don't serve the President, and you don't serve Congress. You serve your country. And a lot of times your country sends you into some pretty iffy ethical battles. Demonizing the enemy is an easy way for soldiers to cope with what they're doing, so trivializing killing makes it easier.

The problem with this is that it also attracts certain undesirables to the cause. The military preaches honor and valor. They say there's no greater deed than serving your country. You want to believe everyone is like Captain America. But in reality there are a lot of scummy people who just want to kill people. That's not to say everyone is like that, or even the majority of people. But they exist, and it's an unfortunate byproduct of the aforementioned training technique. Apologies if I've rambled; your post made me have feelings. Lol

Hey, it means a lot that my post made you feel feelings, so don't feel bad about rambling. There are a lot of bad actors in the military: not just people on the ground going in for an excuse to kill, but people in the brass who have all sorts of malicious reasons for doing what they do. Given the way we treat so many of our veterans, I never take for granted that the people in charge have the best interests of our soldiers in mind. When people see tech like augmented reality and the best they can come up with is making soldiers better killers, it's a big red flag for me.
 

gofreak

Member
Oct 26, 2017
7,845
Nadella's response:

http://www.gamasutra.com/view/news/...and_democratic_HoloLens_military_contract.php

Nadella, however, sees the situation differently, and in an interview with CNN Business explained the company wasn't going to stop democratically elected institutions from purchasing its tech.

"We made a principled decision that we're not going to withhold technology from institutions that we have elected in democracies to protect the freedoms we enjoy," he commented. "We were very transparent about that decision and we'll continue to have that dialogue [with employees]."

The chief exec reiterated he was "clear-eyed about the responsibility we have as a corporate citizen" with regards to the unintended (and potentially damaging) consequences of technological advancement, and claimed he wouldn't allow Microsoft to take "arbitrary action."

"It's not about taking arbitrary action by a single company, it's not about 50 people or 100 people or even 100,000 people in a company. It's really about being a responsible corporate citizen in a democracy," he added.

Ducking the question... that stance isn't particularly scalable. Offloading responsibility to the voter... the voter could vote for many things, or put bad people into power.
 
OP
spam musubi

Member
Oct 25, 2017
9,411
Nadella's response:

http://www.gamasutra.com/view/news/...and_democratic_HoloLens_military_contract.php



Ducking the question... that stance isn't particularly scalable. Offloading responsibility to the voter... the voter could vote for many things, or put bad people into power.

Ah, the typical "democratically elected leaders can't be evil/fascist" fallacy. That really isn't the point of the employees' protest; obviously what Microsoft is doing is legal, but the employees don't feel it's moral. Just because you can develop a technology for someone doesn't mean it's right. A disappointing response that totally dodges the issue.