Humanyze is a data analytics company whose mission is to monitor employees' workdays through digital technology. Designed at MIT, the Humanyze Badge Platform allows employers to monitor their employees every second of every minute of every hour they spend at the company, collecting all kinds of data about their work and their personal and professional communications. Humanyze's ideas are nothing new. The digitization of Taylor's flawed theory of "scientific management" is already practiced by corporate technology companies (Google, Amazon, Facebook, etc.), which gather users' personal data, target them with annoying, irrelevant advertising, and monitor their own employees. But now it has gone mainstream in the workplace.
Taylor (1856-1916) was a determinist who believed that everything a human is and does can be measured or quantified. Taylorism attempted to analyze and measure workflows in the early days of manufacturing and corporate industrialism. I recall it from the 1970s, while working for a company that instituted a "Time and Motion Department". It wasn't there to monitor the washrooms. Its main focus was on measuring product output and meeting targets: if the target was met, employees were doing their job; if it wasn't, the people from the Time and Motion Department marched in with their clipboards, observed the means of production, took copious notes, then departed. A week or so later a new machine or a few more employees appeared, or disappeared.
While Taylorism diminished as a theory in the latter half of the 20th century, other theories grew around it, incorporating the deterministic ideas of scientism along with behaviorism, logical positivism, and a crude form of rationality theory.
Humanyze has taken these dehumanizing theories and turned them into a kind of digitized quasi-moral argument for protecting both employers and employees through "a commitment by a company to continuous improvement" (Humanyze, 2016), when in reality it amounts to the intrusive monitoring of employees in the workplace.
Humanyze's Badge Platform, by its own admission, "is at its most powerful when adopted company wide" (Humanyze, 2016). It is a naïve and cynical argument to claim that the badge "empowers employees to benchmark themselves against career path goals and take actions to achieve those goals" (Humanyze, 2016). It is misleading. In actuality, all kinds of data may be collected about employees without their full knowledge and informed consent. They may not know the nature of the data collected or how it will be used.
Humanyze is the digitization of the Panopticon, conceived by Jeremy Bentham (1748-1832). Humanyze's design allows all employees of a company to be observed and monitored by a single person or piece of technology without their being able to tell whether or not they are being observed.
In Bentham's time it was physically impossible for a single person to observe and monitor everyone at once, but because those inside the Panopticon could never know when they were being watched, everyone had to act as though they were being watched all the time. Humanyze's product allows for the continuous monitoring of employees and the mass accumulation of data on every single employee of a company. The sinister "buy-in" sought from employees, according to big data analytics, is that they have access to their own data. They have, however, no control over how that data is used.
Humanyze's products are marketed as a way "to leverage internal digital communication and to identify risks within their organization" (Humanyze, 2016). This is simply obfuscation of the real goal, which is to gather employee data and use it to realize the Orwellian concept of compliant citizen-workers. The risks to an employee's privacy are subordinated to the crude theories of managerialism, whereby "Managers can proactively understand disruptions to their teams or can be warned of potential project failures based on communication gaps and senior leadership can understand the behavior profiles of high performing teams and target training to raise the performance of all teams" (Humanyze, 2016). The potential misuse of data, and the ethical considerations which should underpin the mass gathering of employee data (or anyone's, for that matter), are missing.
Humanyze declares unashamedly the extent to which an employee's privacy will be invaded: "As part of the Digital Platform, Humanyze offers fully automated extraction services to enable ease of deployment. Our extraction tool, DGGT (pronounced 'dig it'), will allow your technical staff to configure and automate extractions for most major email, calendar, and chat platforms" (Humanyze, 2016).
Humanyze has not considered all the legal and ethical implications of its products. For example, there is the potential for discrimination. Data analytics in the public and private sectors may be used by governments and companies to make determinations about our lives and our right to self-determination. Predictive analytics makes decisions and judgments about people, and will have a negative impact on individuals because it is devoid of any value-based social communicative process. Humanyze potentially, and in all likelihood, legitimizes covert discrimination against employees through its data analytics. It will be very difficult for any employee to detect and prove that they are being subjected to discrimination based on their age, race, creed, color, sex, national origin, religion, sexual orientation, gender identity, disability, marital status, or socioeconomic status.
In addition, there is the potential for massive breaches of personal data. We have already seen the exposure of millions of employees' and private citizens' personal details through hacking (Armerding, 2014).
There is also the risk of big data being used by second and third parties for research purposes without the legal and ethical consent of those whose data has been collected. It is not possible to securely anonymize all data; individuals and groups can always be re-identified.
If Humanyze's products are to have integrity, then individual employees should have control over what data companies collect about them and how it is used.
Unlike Europe and the UK, the United States takes a complex approach to data protection, at least when data is not simply used at will by government agencies and their surrogates. Sotto and Simpson describe US data protection laws as a "patchwork quilt" (Sotto & Simpson, 2014), and reading through the complex laws at the federal and state levels, it seems that "…in regulated contexts…individuals are provided with limited choices regarding the use of their information". This is perhaps something everyone knows, but a digitized Panopticon is not going to reverse it, or for that matter change anything in favor of the individual, in the foreseeable future.
References
Armerding, T. (2014, December 8). The 5 worst Big Data privacy risks (and how to guard against them). CSO. Retrieved from http://www.csoonline.com/article/2855641/big-data-security/the-5-worst-big-data-privacy-risks-and-how-to-guard-against-them.html
Humanyze. (2016, September 14). How It Works. Humanyze. Retrieved from http://www.humanyze.com/products.html
Sotto, L., & Simpson, A. (2014). Data and Privacy Protection. London: Law Business Research.