
                                                                         Washington DC
                                                                      December 14, 2018

Memorandum

To:                           Chairman Sen. John Thune (R-SD); Ranking Member Sen. Bill Nelson (D-FL)
From:                     Brian H, Chief Technology Advisor
                                US Senate Committee on Commerce, Science, and Transportation
Re:                          Proposal of the PRIVA-C Act, to be introduced in the 116th US Congress

___________________________________________________________

Issue:    
The United States faces challenges in determining when and where regulation is appropriate for platform and new media companies such as Google LLC, Facebook Inc. and Amazon.com, Inc. The issue will be a priority for the 116th US Congress, in session from January 3, 2019.
Proposal:
   ‘Platform Reform and Information Verification for American Consumers Act’
                                                                         PRIVA-C ACT
In preparation for congressional debate surrounding the drafting of regulatory legislation for platform technology companies such as Facebook, Inc. and Google, LLC, this memorandum will serve to outline relevant background and analysis, chief among the questions being: Should the United States regulate such companies at all? Is it the role of the federal government to impose restrictions on private tech enterprises in the free market? If so, what specifically warrants regulation? And where is the balance between protecting the sovereignty of free-market principles and mandating guidelines to maximize and protect the public good?
New media and technology-sector enterprises must be able to continue to foster the innovative spirit and practice that have allowed the United States to be a world leader. We must ensure that America's technology sector is not overburdened with regulation and is able to freely trade and compete in the market. That being said, as the technological state of the world changes, so too must norms, policies and law. Indeed, technology alone cannot maximize social good or ensure privacy; these are outcomes that can only be achieved through a combination of technology, policy, law and innovation.
1 – SCOPE
______________________________________________________________________________
Assumptions:
  • Data is political and regulation poses political challenges as much as strategic and technological ones.
  • Regulations are not solutions searching for a problem.
  • Self-regulation would ideally be sufficient to address the issues we face in data privacy and information security. Events have shown, however, that self-policing will not be enough to assure the American people of their privacy, and that the free market simply does not provide the security and privacy that citizens demand.
Concerns:
Does an unregulated market reach a more efficient outcome?
  • Regulation does not necessarily hinder innovation
Though the American people have always prided themselves on their innovative spirit and their respect for the free market, they have also consistently understood the need for regulation when and where it made sense and benefited the common good. In fact, regulations, or the imposition of development and operational parameters, have in many cases led to more, not less, innovation.
Platforms are markets that bring together buyers and sellers of services. They have built-in standards that allow others to come and build upon them and add functionality, often with huge implications, as the world has seen with Google Maps. In many cases, platforms have been shown to significantly lower the barriers to entry for bringing new and innovative ideas to fruition and into the marketplace; Amazon Web Services (AWS) is a comprehensive example of this effect. It is no longer necessary to secure millions, or even hundreds of thousands, of dollars of investment in order to build new software-enabled products from the ground up. Entrepreneurs can now accomplish the same goals with minimal investment, a much smaller team and a much shorter time frame, which, together with an agile approach of constant iteration and testing, is essential to being competitive in software.
One of the problems we face in government has been a failure to recognize the utility of building upon such services in the public sector. By analogy: suppose a highway from Washington, DC to Boston already exists when a need arises for a new highway to New York. Platforms allow the new route to share half of its journey with the existing highway, requiring only an exit and a bridging road to New York rather than an entirely new highway. Those who control platforms will hold disproportionate relative authority in the future, and the government has a duty to regulate and be involved.
The need for transparency and accountability
Events over the last few years have led much of the American public to no longer ask whether to build regulatory frameworks for the big technology firms, but where specifically, and more importantly how. Conversations around the need to regulate the sector grew in fervor after events such as the interference of Russian security services in the 2016 United States elections and large-scale data breaches resulting in the loss of personal information belonging to millions of users (American and otherwise) of entities such as Facebook and Equifax.
Built-in Security
Unlike the building you are sitting in, which came into existence through regulations and laws (for example, requiring architects and engineers to hold qualifications and certifications in order to design and build), the current state of unregulated platforms amounts to a system of unlicensed architects and builders accessing the tools of construction via an app and iterating their way to a finished building. They are not designing for security from the outset as one would in physical space; rather, they are building and patching as they go. This is another obvious reason to call for security regulations for platform services.
Rationale:
  • Regulation cannot be an afterthought
Regulations tend to be introduced as an afterthought. They are often a reaction that arises after an event, designed to prevent it from happening again rather than preventing the event itself ('left of boom'). On a departmental scale, one can look to the establishment of the position and office of the Director of National Intelligence, the Department of Homeland Security itself and the USA PATRIOT Act as reactions to the attacks of September 11, 2001. Similarly, one could cite the establishment of the United Nations, and the rules, norms and regulations that came with it, as a reaction to the follies and failures of the international system that led to the Second World War. Looking across society or industry, it is not difficult to find examples that highlight this notion, from seatbelts arriving only after a threshold of societally acceptable crashes had been passed, to FDA regulations on cigarettes, and more recently vaporizers and smokeless tobacco products, finally coming into existence only after the harm became quantifiable. What this points to, in the main, is a failure of imagination on the part of the governors of society, and to an 'ask for forgiveness rather than permission' / 'unless the law says I can't, I can!' / 'move fast and break things!' mentality in industry. This mentality is precisely what has led America to be a world leader in innovation. However, as software begins to touch more and more of the physical world, the need for common-sense preemptive regulations becomes stronger than ever before. No longer does a software glitch lead only to a crashed computer; it can lead to a crashed car.
Normative considerations:
  • To regulate or not to regulate, that is no longer the question
Tim Wu recently opined in the New York Times that 'We fight to protect the means but neglect the ends' in relation to our right to the pursuit of happiness. In other words, we take great pains to protect our system of free trade and entrepreneurship in the free market at all costs, yet we may not, as an aggregate society, be better off or happier with the outcomes. Respecting the free market and trusting platform services to regulate the flow of information themselves has not been a successful strategy, as we have seen in recent elections and in the deepening tribal fissures in American politics. Similarly, we have seen that in the event of data breaches or misuse, the incentives have not been great enough to compel platform companies toward transparency and reform. The market simply rewards speed over security, and apologies coupled with explanations that such services are 'new' and developing have not led to any tangible solutions. The time to introduce comprehensive legislation is now.
2 – Goals of PRIVA-C Act
______________________________________________________________________________
  • Safeguard consumers’ privacy, and help to protect data from manipulation and misuse
  • Safeguard against the manipulation of platforms by foreign actors to subvert US elections
  • Ensure greater transparency in the use and trade of Americans’ data
3 – THEMES of PRIVA-C Act
______________________________________________________________________________
3a – Security (Content and data):
  • Platforms have a responsibility to:
    • Protect users’ data
    • Protect networks from unauthorized access
    • Monitor for and consistently eradicate bots and fake accounts (classified as ‘identity fraud’)
    • Ensure transparency of sources of information and funding on their services
    • Basic security protocols must be met in development before market access
Information: Facebook CEO Mark Zuckerberg stated in congressional testimony this year that he sees his company as a 'technology company'. Given that as many as 68% of the US electorate report that Facebook and other social media represent their main source of news and information, it is incumbent upon the leaders of these companies to honestly assess their prominent role in the information ecosystem, and to take serious steps to maintain transparency and be forthright, as is expected of traditional news sources. They must recognize that they too are media companies. Social media giants have a duty to protect the information space they create in the same way that newspapers do. Ensuring the authenticity of the information that proliferates on their platforms is a difficult task given its speed and scope; at a minimum, however, they must provide information on the origins of pages and posts. In the same way that political television ads require attribution ('I am candidate y, and I approve this ad', 'Paid for by x'), so too must advertisements or propaganda of a political nature online. This ought to be part of any new legislation. Social media companies have a legal responsibility to protect against hate speech. They must have the same responsibility to protect against 'fake' news by providing information on the funding and origin of posts, as well as by constantly monitoring for and deleting fake accounts and profiles. Technology companies are responsible not only for the data they keep, but for the content that is created and shared on their platforms.
Networks: Recent data breaches at sites such as Quora and Facebook highlight the need for minimum security standards, guidelines on going public with the information, and accountability for the loss of data as a result of unauthorized access.
3b – Privacy:
  • Platforms have the responsibility to:
    • Be transparent about where and how their users’ data is being trafficked
    • Allow access to users' data only with consent
Blockchain has been touted as an answer to the privacy issue. Proponents say that the technology will allow for the control of data to be with the users in the near future and therefore efforts towards regulation are unnecessary. The evidence remains theoretical. Regulations and laws under threat of legal ramifications, as well as the development of norms of behavior are the best ways to ensure user data privacy.
Platforms have a duty to provide transparency in regards to who buys advertisements on their services, but individual behavior must be treated differently and with more concern for privacy.
4 – PRIORITIES
______________________________________________________________________________
Reasonable expectations:
  • Security: Companies must ensure that they meet minimum standards of data and network security or face legal consequences, including fines and imprisonment.
  • Authenticity: Maintaining hygienic information ecosystems and requiring social media companies to abide by similar laws to those in the media landscape (attribution of political advertisements, citations for sourcing, transparency of post origin etc.)
  • Transparency: At a minimum, regulations must focus on increasing public visibility into the operations of platform companies as the stakes are too high. Ensure users have a say in the extent to which their information is shared with 3rd parties and data brokers.
5 – PRIVA-C ACT (content proposal for debate and iteration)
______________________________________________________________________________
**WORKING DRAFT**
  1. Rigid standards included in Bill
  • In alignment with the EU's GDPR, a 72-hour reporting timeline for data breaches is to be codified in law.
    • Require companies to retain teams capable of forensic analysis, on staff or by contract (similar to a DHS CERT team), so as to be able to comply with the mandated 72-hour timeframe for reporting any and all data breaches. The market rewards cheap and fast delivery of goods and services that are feature-rich but do not necessarily have security built in from the beginning. There is an incentive not to secure services and to rely on insurance, or on public outcry quieting down, in the event of a data breach. This is unacceptable.
  • In line with calls from Microsoft, the sharing of facial recognition and fingerprint data with third parties to become 'opt-in' by law
    • The sharing of facial recognition data and biometric data such as fingerprints or DNA (collected by services such as '23andme.com') is legal only with the opt-in consent of the owner. Biometric data is legally classified as a 'property right'.
  • Individual access to personal data sharing history a legal right
    • The US Freedom of Information Act (1966) allows for public access to federal agency records. No legislation presently mandates that platform services present a full record of the exchange of personal data upon request. The PRIVA-C Act classifies personal data as individual property and requires by law that full and detailed ledgers of data transfers and sales be made available upon request.
  • Transparency on the origin of content
    • Geographic location data for posts on social media, as well as for advertisements, must be made available by default (users should not be required to ask for the information; it must be presented).
  • Transparency similar to traditional media for political advertisements
    • SNS companies such as Facebook are not solely 'technology' companies; under the PRIVA-C Act they are considered media companies and are subject to the same regulations on political advertisements. Advertisements of a political nature must contain funding attribution and messages of approval from candidates for office.
    • Amend 47 U.S.C. § 230, a provision of the Communications Decency Act, so that new media platforms are no longer exempt from laws on common decency in publishing. Going forward, there is accountability for the information shared on these sites.
  • Ban on weak default passwords for IoT enabled devices
    • Similar to the newly passed California IoT law, 'Information Privacy: Connected Devices', the PRIVA-C Act will include language to nationally ban weak default passwords for devices capable of internet connection (the 'Internet of Things').
  • Data portability
    • In 1996, the US Congress passed the Telecommunications Act. Section 251 required all local exchange carriers (LECs) to offer their customers telephone number portability. In the same way, the PRIVA-C Act seeks to give individuals property rights over their data and to make it 'portable' between services. This should increase the incentive to protect data and also stimulate competition: if customers are unhappy with one service, they may by law take all of their data out of it and transfer it to another.
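To make the portability idea concrete, the following is a minimal sketch (the record fields and function names here are illustrative assumptions, not anything specified in the draft Act): if every service can export a user's records in a common machine-readable format, a competing service can import them without loss.

```python
import json

# Hypothetical records held by one service about a single user.
user_data = {
    "user_id": "example-user-001",
    "profile": {"name": "Jane Doe", "joined": "2015-06-01"},
    "posts": [{"date": "2018-11-30", "text": "Hello"}],
}

def export_portable(data: dict) -> str:
    """Serialize a user's records so another service can import them."""
    return json.dumps(data, indent=2, sort_keys=True)

def import_portable(blob: str) -> dict:
    """A receiving service reconstructs the same records from the export."""
    return json.loads(blob)

# The round trip must preserve the data exactly for portability to be real.
blob = export_portable(user_data)
assert import_portable(blob) == user_data
```

The analogy to number portability holds: what matters is not the storage format inside each service, but an agreed export format at the boundary between them.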
  2. Flexible Standards included in Bill (circumstance dependent, e.g., environmental regulations)
  • Collected data must have first person consent
      • Wherever possible, opt in consent for the collection of data not vital to the execution of service must be obtained from the user
  • ‘Right to be Forgotten’
    • Circumstantially dependent, not a ‘right’ as is codified in the GDPR, but classified as a ‘privilege’ in US Code.
    • Within reason and outside of the event of criminal activity, services must delete historic data by user request.
  3. Liability and Compliance Mechanisms
  • Algorithmic auditing
    • Platform services will be required, in certain situations, to provide access to algorithmic information to ensure ethical and legal compliance
  • Failure to maintain transparency on political content funded by foreign entities will be classified under 'criminal' law in the PRIVA-C Act.
6 –   TO PURSUE IN CONCERT
______________________________________________________________________________
    1. Sen. Mark Warner’s Public Interest Data Access Bill. Push for bipartisan support of this recently proposed legislation. It is hard to police the trafficking of individual private data when the walls are strictly protected by the platform. More transparency is needed.
    2. Call on individual states to push for regulation. As an example, if California passed regulations on automobiles that forced certain changes to Tesla models, it is likely that even though the regulations do not apply in other states’ markets, Tesla would begin to only produce vehicles compliant with the CA laws for cost concerns which vicariously regulates the product in other markets that may have been slow to regulate, like Nebraska or Oklahoma. This could be considered a ‘trickle down’ system of regulation.
    3. Bipartisan legislation to fund a nationwide social media literacy campaign. Do we live in a world of technological determinism, or is it society and policy that determine our technological progression? In either event, we must encourage:
     1) Bringing technology into policy making
Designing and implementing effective regulations in the technology sector has always required, and requires now, direct input and guidance from individuals and entities truly and deeply literate in the subject matter.
     2) Bringing policy into technology development
As technology ought not to be regulated post hoc in a reactionary fashion, individuals or entities with relevant policy knowledge and experience ought to be a required and integral part of the development of products and services in the digital economy.
CONCLUSION
______________________________________________________________________________
The overall goal of regulation is to design frameworks for operation, incentives for behavior that maximizes public utility, and punishments for behavior that harms it. Regulations for the technology and social media industry are long overdue, and as a result we have found ourselves in a state of constant 'clean-up', reacting to events rather than preventing them.
This memorandum should serve to give some background on the issues and to propose guidelines for thinking about them as regulations are proposed and debated.

Memorandum — Hon. Sawhney


                                                                 Republic of India

Office of the Secretary, Ministry of Electronics and Information Technology

 Memorandum

To: Hon. Ajay Prakash Sawhney – Secretary of Ministry of Electronics and Information Technology, GOI
From: Brian H, Snr. Policy Advisor
Re: Rise in the spread of disinformation leading to mob violence and fatalities; Background – Options

Dear Sir,

As you are aware, the spread of disinformation and the overall polluting of the online information space in our country have been on the rise in recent years. To highlight the urgency of the matter: over the last two months, more than 20 innocent citizens have been victims of violence directly correlated with an uptick in malicious rumor and online vitriol.

Your public statements, Mr. Secretary, have been passionate and consistent. Yet the issue has persisted. With continuing violence creating increased public pressure on Prime Minister Modi, and with the upcoming national elections of 2019, it is now more important than ever that we take highly visible and effective steps to decrease false news proliferation and its resulting violence.

I propose to you, sir, the following recommendations falling largely into two major categories:

Recommendations:

  1. Increasing media literacy > Banning services and access
  • Today, there are nearly 200 million daily users of these services, and for services like WhatsApp, India is the largest market, something that could certainly be an economic advantage. Services like WhatsApp and Facebook are particularly important in rural communities, as they connect them to the bigger national identity, politics and economy. No doubt, many votes needed in the next election will depend on the information flow made possible only by these services.
  • Banning products like WhatsApp and Facebook altogether would prove disastrous to many informal economies on which much of our nation's people depend. Besides this, it is nearly impossible to completely combat the phenomenon this way; as we have seen in other countries, messages simply move into direct messaging on other applications, messengers and email services.
  • Encourage public awareness campaigns: Similar to a $14 million program that Facebook undertook in the United States, we must require these companies to partner with local media in India to raise awareness. We should encourage acts like taking out pages in newspapers giving insights into identifying fake information, as WhatsApp has recently done. We must both punish companies for poor practices and reward them for good ones.
  2. Increasing pressure on apps and services to monitor and control the spread of disinformation and the tools for mobilization.
  • Use the law: The Government has in the past issued warnings to WhatsApp CEO (and former Internet.org VP) Chris Daniels, as well as to Facebook itself. Earlier this year, in the summer, Union Minister of Law & Justice and Electronics and Information Technology Ravi Shankar Prasad demanded adherence to Indian law and the appointment of a 'grievances officer', to be based out of a similarly demanded local office in the country.
  • We should introduce formal expectations that require professional competency in positions of influence and relevance to this matter. Facebook recently hired Ajit Mohan (previously of streaming service Hotstar here in India) to run the India division of the company. This should be encouraged as Mr. Mohan is both experienced and competent on the matter, and familiar with regionally specific laws and customs. Issues of truth in online information eco-systems are globally relevant, however the most effective policies for ensuring the integrity of India’s information space will be designed with context and culture in mind.
  • We must increase the presence of law enforcement on these services. I recommend a team in the national police infrastructure to be dedicated to sentiment analysis online so as to better predict and mitigate risks of mobilization in light of any new false information outbreaks.
  • Increase pressure on companies: In 2016, our Government took a strong stance against Facebook for violating India's net neutrality with its 'Free Basics' program, which favored its own services. That pressure campaign may be a successful framework for inducing stronger control of the information forwarded and shared on its platform going forward.
  • One positive result of recent pressure, which received significant international attention, was the removal of the 'quick forward' button from WhatsApp, a move which may have some impact. Additionally, WhatsApp deployed a new feature that labels 'forwarded' messages as such and limits forwarding to only 5 groups at a time, down from the previous 20. However, groups may still contain up to 256 members, meaning an individual can still forward a message to up to 1,280 people at a time.
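The reach figures cited in that last recommendation follow from simple multiplication; a quick worked check of the arithmetic:

```python
GROUP_CAP = 256  # maximum members per WhatsApp group, per the figures above

def max_forward_reach(group_limit: int, members_per_group: int = GROUP_CAP) -> int:
    """Upper bound on recipients of a single forwarded message."""
    return group_limit * members_per_group

print(max_forward_reach(20))  # 5120 recipients before the limit
print(max_forward_reach(5))   # 1280 recipients after the 5-group limit
```

Even under the new limit, a single forward can still reach well over a thousand people, which is why the change mitigates rather than solves the problem.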

To reiterate, sir, there is no foolproof one-dimensional strategy for this new and very serious phenomenon. We must approach designing policy using all of the tools that we have at our disposal. With hope, the above frameworks for thinking through the issues may be of service to your decision-making calculus.

Who is tracking me!?

There is a clear trade-off that must be made between privacy and convenience. I tend to feel more comfortable leaning towards sharing less and using less when it comes to social media and other applications, even if it makes life slightly less convenient. I don't use Facebook, for example. The benefits I used to get from posting something about my life in the end didn't outweigh the needless anxiety of waiting for 'likes', or worse, the uncomfortable realization that I was building a permanent digital record of my life.

Uber
In saying the above, I recognize that it is odd that I am an avid user of the ride-share app Uber. Uber similarly collects quite a lot of aggregate data and gets to know a lot about its users (where they work, live and even who they know, i.e., share rides with). Although I am not entirely comfortable with Uber either, the benefits have absolutely outweighed the privacy concerns for me, and I use it regularly. This semester, for example, I have two days a week where I finish a course at the Kennedy School and have another course down the road only thirty minutes later. Without Uber, I am not sure I would have been able to take both of these courses at the same time.

Uber sells the data I give it to brokers. This is a concern. But defining the broker and how they use my data is also important. Some of them may use it for advertisement and profiling purposes, and I am not so comfortable with that. However, there are a few cases where entities use Uber's data for more socially altruistic purposes, and I don't mind contributing to those. In the past, for example, the city of Boston announced a data-sharing collaboration with Uber to help with 'managing urban growth, relieving traffic congestion, expanding public transportation, and reducing greenhouse gas emissions'. Apart from this, Uber also helps around the country with Amber Alert programs. Given that the data Uber has about me is largely locational metadata, that I don't feed it all of my innermost thoughts and feelings, and that it doesn't advertise anything to me, I am relatively comfortable with it tracking my movements in the area; its convenience outweighs my privacy concerns.

Twitter
I also use Twitter, in a limited capacity. I have what some call a 'ghost account', in that I have never posted anything and there is no identifiable information about me (my screen name is random, no bio, etc.); I have no followers and I don't engage with anyone. However, that is not to say that there is not significant personal metadata being collected, nor that it doesn't 'track' me (I have only recently turned off the location data). Why I use the service if I seem so paranoid about it, or don't engage with it in the way it is designed, is a natural question with a simple answer: there are just too many interesting academics and thought leaders who use Twitter and are hard to follow anywhere else. An old professor of mine, for example, uses Twitter prolifically and provides real insight on a few areas I am fascinated by. He rarely goes on TV and doesn't actually publish all that often, save the rare New York Times op-ed. I am able to see his multiple insightful tweets every day, if only I bite the bullet and follow him on Twitter.

Still, I am perpetually conflicted. I am concerned that I have made an echo chamber for myself, and I am further worried that Twitter collects more data than I am aware of. The same reasons that got me off of Facebook some time ago motivated me to engage with Twitter (which I started using much later) in a different way. I still consider not engaging at all, though, and will give this more thought.

Concerns
In speaking this week about this post with my friends and family, it seems that most of the people around me are not concerned about the data being collected on them by their social media apps; they only worry about how it is stored and whether or not it is safe. I am the opposite. In general, I find myself feeling uncomfortable with social media companies collecting my data, and I do my best not to feed too much information to them (that which I can control... like my lunch choice...). However, I am not particularly worried about the security of my information on their servers. I feel confident in the security of these services, and assume they are quite safe, as they have some of the best of the best working on data protection and their whole business models depend on it.

Advertising is one thing, but the Facebook / Cambridge Analytica saga this year really put the issue front and center. This will be a continual battle of convenience versus security and I’m sure all of our behaviors and decisions on which services to engage with, and what to give them, will be subjective and ever changing.

Determination on LastPass for Harvard Kennedy School

As CIO for the Harvard Kennedy School, I have the unique responsibility of looking across industry, academia and government for best practices in maintaining the integrity and security of the school's networks, as well as the data privacy of both students and faculty. Designing policy towards these ends requires multidimensional thinking, as the causes of network intrusion are various and the stakes potentially high.

With that in mind, I have recently been tasked to make the determination as to whether or not the use of password management software, such as LastPass or 1Password, should be made a requirement for all HKS students and staff.

In threat modeling for the HKS network, there are clear risks to the system in the forms of tampering, information disclosure and elevation of privilege. If unauthorized users access the system, they have access to a wide range of services and networks. It is difficult to get into the system; however, it is not difficult to navigate laterally within it once it has been breached. This is a problem. However, I do not deem it a problem that is solved by the imposition of regulations requiring students to use a password manager.

As such, I have decided not to make such a recommendation, and the reasons are as follows:

  1. Centralized password repositories are prime targets for malicious hacking activity. 
  2. Password managers are designed with convenience in mind more than security.
  3. Two-factor authentication is already required to access ALL HKS systems; requiring individuals to protect their private internet accounts with a password manager is an infringement on their individual rights and preferences.

Prime targets

The fact is that the Kennedy School is a prime target for hacking. Every year, the school trains and produces future leaders in both the public and private sectors. The idea that an adversary may wish to mine the data of a current HKS student who may one day become a public official, in hopes of eventually having the capability to blackmail them, is not as far-fetched as one might think. In addition, many of the school's faculty maintain highly placed ties in the US Government and in industry around the world. The theft of their data could be used in nefarious activities throughout many other networks and could cause severe damage.

Given these realities, it seems a foolish security mechanism to contain all password data in the same place, with or without sophisticated encryption, and with or without each password being randomly generated by the application. The reason one would do so is surely convenience, which is not within the purview of IT administrators to govern or regulate.

Convenience > Security

Students understandably experience difficulty maintaining and remembering their passwords, which are numerous across internet services. It is no longer common, nor proper, for people to reuse the same password across services, and it is not uncommon for individuals to store passwords in a Word document or something of the sort on their computer, or to auto-save passwords in their browsers. Both practices are unsafe, and the appeal of the convenience of password managers is understandable. Password managers take great pains to protect the data given to them, using sophisticated end-to-end encryption. Of this there is no doubt. Unfortunately, the fact remains that centrality is an inherent security vulnerability, and the most effective security measures rely on distributed storage and alternative means of cataloging data (even securing them in physical locations that only the user knows of). There is always a balance between security and convenience, and each person must make up their own mind about where on that spectrum they wish to live.
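To make the encryption claim concrete: a password manager typically derives its vault key from the master password with a slow key-derivation function, so that the ciphertext is useless without that one secret. The sketch below is illustrative only; the function name, iteration count and vault layout are my assumptions, not any vendor's actual implementation. It also shows why centrality worries me: everything reduces to a single master password.

```python
import hashlib
import os

def derive_vault_key(master_password, salt, iterations=600_000):
    """Derive a 32-byte vault key via PBKDF2-HMAC-SHA256.

    Illustrative sketch: real products layer authenticated encryption
    (e.g. AES-256) on top of a key like this, and tune the iteration
    count to their own threat model.
    """
    return hashlib.pbkdf2_hmac(
        "sha256", master_password.encode("utf-8"), salt, iterations
    )

# A unique random salt is stored alongside the encrypted vault.
salt = os.urandom(16)
key = derive_vault_key("correct horse battery staple", salt)
assert len(key) == 32  # suitable key length for AES-256
```

Note that the derivation is deterministic: the same master password and salt always reproduce the same key, which is exactly why compromise of the master password compromises the entire vault.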

Infringement on individual rights and preferences

I believe it to be in line with Harvard's ethics that we not make anything mandatory that infringes on individual students' preferences for PII management. It would perhaps be acceptable to mandate that all students use LastPass to manage their Harvard account passwords; however, there simply aren't enough Harvard accounts for this to be practical (it is not difficult to remember passwords when there are only a few, e.g. HarvardKey or XID).

Alternatives to solve the problem:

  1. Require two-factor authentication for any and all products or services that interface with HKS networks or store HKS data.
  2. Require password changes every semester for Canvas and Harvard Email.
  3. Implement a 'cyber awareness' training module that is required at the beginning of every semester. It would consist of a 30-minute interactive multimedia program in which threats such as phishing are simulated to raise awareness and set norms. Overall, better cyber awareness amongst the student body would be an effective, yet simple, policy.
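The two-factor requirement in alternative 1 generally rests on time-based one-time passwords (TOTP, RFC 6238), the mechanism behind most authenticator apps. As a sketch of the underlying mechanism only, and not a description of HKS's actual deployment (which would be handled by a commercial provider), the core computation looks like this:

```python
import hashlib
import hmac
import struct
import time

def totp(secret, timestamp=None, step=30, digits=6):
    """Compute an RFC 6238 time-based one-time password (SHA-1 variant)."""
    if timestamp is None:
        timestamp = int(time.time())
    counter = timestamp // step                      # 30-second time window
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test secret; at t=59 the 6-digit SHA-1 code is 287082.
print(totp(b"12345678901234567890", timestamp=59))  # prints "287082"
```

Because the code is derived from the current time window and a shared secret on a separate device, a stolen password alone is not enough to log in, which is why I weight this control above mandatory password managers.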

Conclusions:

  • LastPass and other password managers are a largely safe and certainly convenient way to manage passwords across services: they randomize passwords to be more complicated than is practical to retain in memory, and make them accessible through a single master password, which can itself be protected with two-factor authentication. Since a single unauthorized intrusion into the HKS network can cause severe damage if the intruder is sufficiently knowledgeable or skilled, it is in the interest of HKS to do everything it can to ensure there is no unauthorized access to, or distribution of, students' passwords.
  • That being said, it is my judgement that as CIO I cannot make the use of such services mandatory, as doing so stretches into broader online lifestyle decisions which must be made only by the individuals themselves, not by the school.
  • We will continue to take security very seriously at HKS, and we will continue to pursue other avenues of maintaining the integrity of our networks and data.

Memorandum – GaaP in Mass

                                                       Memorandum 10-05-2018

              To: Hon. Charlie Baker, Governor, Commonwealth of Massachusetts

              From: Brian H, CIO, Commonwealth of Massachusetts

              Re: Proposed framework and feasibility of Government as a Platform

The purpose of this memorandum is to briefly outline Government as a Platform (GaaP) as it works in concept, and to provide you with a framework for the option of adopting the practice in Massachusetts' public infrastructure.

Traditionally, the relationship between a democratic government and the public it serves is transactional, rather than collaborative. The government determines the services it will offer based on input from its people as well as what it itself deems to be in the public interest. The public pays taxes and receives those set services through the development and delivery mechanisms of the government.

GaaP is a more inclusive and adaptive system that has the potential to increase capability and center the relationship around the user (public). It is a system by which government agencies and departments enable the people to use public data and provide infrastructure for the development of tools and services that they want to get out of their public institutions.

For context, a common analogy is the Apple iPhone. When it was initially released, it came with a set of 16 apps designed internally by Apple. Customers purchased the phone and used its applications as installed. This is the traditional government model.

When Apple opened the iPhone up as a platform, making development tools available to the public so that anyone could build applications and distribute them to Apple users around the world, it transformed the iPhone from a product into a mobile platform and vastly increased its capability and appeal. This is the goal of GaaP.

Democratic governments have a responsibility to represent and interact with the public they serve. What is sure in the coming years is that governments and citizens need to speak the same language. As industry and the general public fully utilize technology (in this case referring to things such as collective intelligence, crowdsourcing, cloud computing, social media and the like), so too must the government utilize these services.

GaaP already exists and is successful in other countries, as well as in this one. Indeed, opening up weather and geographic data collected by the government for the public to build tools with has been successful in the United States for decades, and the success of the concept is relatively accepted on both ends of the political spectrum.
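In practice, the seed of GaaP is an open, read-only data API that outside developers can build against. The sketch below shows the shape of such an endpoint using only the Python standard library; the path `/v1/rmv/wait-times` and the dataset it serves are invented for illustration, not an existing Commonwealth service.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical open dataset; a real deployment would publish live agency data.
DATASETS = {
    "/v1/rmv/wait-times": [
        {"branch": "Boston", "minutes": 42},
        {"branch": "Springfield", "minutes": 17},
    ],
}

class OpenDataHandler(BaseHTTPRequestHandler):
    """Serve read-only JSON so outside developers can build their own tools."""

    def do_GET(self):
        data = DATASETS.get(self.path)
        if data is None:
            self.send_response(404)
            self.end_headers()
            return
        body = json.dumps(data).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

def serve(port=0):
    """Start the API on a background thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), OpenDataHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

The point of the sketch is the division of labor: the government's job ends at publishing clean, stable data endpoints; the wait-time apps, maps and alerts are then built by the public, which is precisely the iPhone-style platform shift described above.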

In the case of Massachusetts, there is much room for GaaP to be adopted. In order to do so, however, there are some criteria that need to be met. For example, there would need to be an iterative investment cycle. The proposed $1B USD would likely be sufficient, but a lump-sum investment with a fixed timeline and milestones would not be effective.

The recommended practice is for myself as CIO to put together a small and segregated team, identifying those who could collectively make for an innovative ecosystem. We would need a few unique authorities, open access to data, healthy and frequent communication with you, as well as sample users to provide constant feedback during development. The process would be highly iterative and fast, building on successes and requesting funding injections, and terminating failures quickly. We would look for developers to use their creativity to build tools on the platform to show its potential. In essence, we would select one project and apply an agile development process to it.

In regards to selection criteria, we must start with the user, looking at the public and their behavioral characteristics. We should look to isolate a particular area of Massachusetts Government that experiences many pain points, the easing of which would have significant public and social impact. It would also have to be a project that initial analysis suggests we can solve. The goal is to rethink institutions, making them digitally native from the ground up. There is no point in investing in a long and difficult project simply to update a current institution.

Wherever we are successful, we can extend the practice to other institutions and slowly scale laterally. Investing $1B USD of taxpayer funds in an attempt at wide-scale reformation of the State Government is not advisable. We should start small, constantly iterate and adapt to results, and build only on successes.

Options for piloting: Accordingly, the following are a few institutions from which we could select one to begin with:

Registry of Motor Vehicles (RMV)

  • Significant pain points and inefficiency. Fixes would be very well received by the people and have significant impact. Potential tools that could be built with road and vehicle data are numerous.

Massachusetts Bureau of Geographic Information (MassGIS)

  • Potential for new services that help in emergency response, real-estate research, environmental planning and management, transportation planning, economic development, and engineering services.

Dept. of Housing and Community Development (DHCD)

  • Potential for new ways of interacting with community services and funding opportunities that may not be well known. Data could be used for community watch services, finding funding for social events and fairs, etc.

Final thoughts: Given the nature of this process and the fact that versatility is built into it, standard governance models will be ineffective or obstructive. For example, unlike many traditional development processes, there should not be a governance structure that requires an upfront roadmap or milestones. To manage this process, I recommend an adaptive style of innovation management. Restructuring and rethinking the Government of Massachusetts as a platform will be a process with no end date. It will be a process in constant progress, with the public constantly increasing their government's effectiveness as a service provider. Our job would be to build the platform from which innovation and new services can be continually launched.

We can measure success continually through feedback, observing the extent to which the Massachusetts public engages with the new platform and the extent to which new services are born from it and enjoyed.