Is It Ever OK to Break Into Your Colleague’s Computer?
Rick was looking for Kelly, one of the analysts who supported his consulting team. Kelly’s hours were a mystery, and Rick was confused and annoyed. He tried to check her calendar online but, after struggling to locate it, solicited help from an IT specialist. They found all sorts of surprising weekday appointments: sessions at the gym, lunches with friends, even horseback riding lessons fifteen miles outside the city. They went to the department head, who authorized a review of Kelly’s entry and exit records from the lower-level parking garage. Kelly had long been a problem employee, and now Rick might have the goods on her.
Steve, a portfolio manager for a major mutual fund company, understood that his company monitored internet use of all employees. The surveillance was even tighter for the traders, and an alert flashed on Steve’s PC when his trader, Eric, started surfing questionable sites. He warned Eric to cut it out, but the alert kept flashing. Eric was a great trader, but this posed a real difficulty for Steve — what should he do with the information he possessed?
I had been preparing a client presentation with several of my colleagues at our asset management firm. The night before the meeting, I reviewed the document on my laptop at home and noticed that several key pieces of information were missing. Early the next morning, I emailed the group to explain what we still needed, but one person was out and not easily reachable. We realized that the required data existed only on his password-protected computer. I didn’t know whether to unlock his computer or try to work around the problem.
The digital age offers us immense opportunities to hone our detective skills and to observe our colleagues in an entirely new, not necessarily flattering, light; it also presents tremendous challenges in how we use that information. Identifying the transgressor may be the most satisfying part of digital sleuthing, but managers must understand the negative impact of monitoring co-workers. The more aggressive the data mining, the higher the risk of abuse, and the more carefully supervisors must tread to re-establish any lost trust.
At large companies, employee handbooks warn us not to expect privacy protection for whatever content we write, read, watch, or send on corporate-owned computing equipment, including our smartphones.
Electronic performance monitoring, or EPM, has become widespread at many corporations, used to track, count, and analyze employee keystrokes. An important paper by Laurel McNall and Sylvia Roch makes very clear that workers, justifiably, worry about being watched, particularly when they don’t know how their bosses are using the information.
The observer effect in physics, often conflated with Heisenberg’s uncertainty principle, holds that the act of observation alters that which is being observed. Observing your employees via careful digital sleuthing will similarly alter their behavior. On the negative side, it may create worker dissatisfaction, hurt morale, and erode trust. But it can also have a positive impact, if it promotes awareness of the public nature of our electronic presence and results in more thoughtful and responsible behavior. We need to learn how to harness the power of digital surveillance as a benign force.
George, the CEO of a technology company with over 150 employees, believes that you need to be careful when confronting someone with intelligence gained from data mining. Except in cases of egregious misconduct, the cultural damage may well outweigh the benefit. Not only the accused but everyone else who hears about it will wonder, “What kind of company do I work for?”
Martha, who manages a staff of hundreds at a health care facility, says she uses a “root cause” analysis to decipher digital errors and distinguish between a mistake caused by a systemic failure and one caused by individual oversight. In addition, she conducts substitute testing, which recreates the exact problem, to determine whether another person would have made the same error. She believes that her staff understands and accepts that management uses monitoring and digitally sourced information to improve institution-wide performance rather than to assign blame. As Devasheesh Bhave described in his study of EPM, the feedback offered is essential.
In the case of Kelly, who was taking horseback riding lessons on company time, Rick made sure her boss saw her garage timestamps. She lost her job immediately. The record of her comings and goings in the parking garage painted a clear picture of abuse. The consequence of surveillance fit the crime, and it still sends the right message to everyone who hears about it.
Steve couldn’t sit on Eric’s website transgressions; the trader was also fired. Eric sued the company, maintaining that he did not receive sufficient warning, that the email links to pornography came from other traders inside and outside the firm and that he never opened them, and that the monitoring system unfairly targeted his computer. Steve is torn. Eric was a great trader, but he has not worked in two years and, with a lawsuit pending, he cannot get a job. Steve worries that he might have jumped to conclusions and wonders if he should have given Eric a sterner warning instead.
Having never done anything like it before, I couldn’t bear the idea of hacking into someone’s computer. We managed to pull together most of the missing data, and the client didn’t notice the difference. We then implemented new processes for timelines and safeguards that probably should have existed in the first place. The incident forced us to consider important questions about intra-office transparency and the extent of privacy at our firm. I wonder, however, under what circumstances I might make the opposite decision and unlock the computer.
The more we can sift through and massage the troves of data accumulating on our servers, the greater the need to understand how to use that output, to weigh its positive and negative implications, and to craft and deliver the appropriate message to our colleagues.