Code or Clear? Encryption Requirements (Part 2)
In the last post, I talked about the role of encryption in fashioning a “reasonable” security plan for sensitive personal information and other protected data routinely collected, stored, and used by an enterprise. But lawmakers and regulators are getting more specific about using encryption and managing data that is risky from an ID-theft perspective. Here are some leading examples of this trend.
State Security and Breach Notification Laws
Since California adopted SB 1386, which went into effect in 2003, nearly all US states have enacted security breach notice laws that require notice to affected individuals, and in some cases to public authorities, when a party has reason to believe that the security of protected categories of personal data has been compromised. The protected categories are typically Social Security numbers (SSNs), driver’s license numbers, and financial account or payment card details (usually only if the password or access code is also compromised), and, increasingly, medical data not covered by federal HIPAA privacy protections.
All of these laws provide an exemption from the notice obligation if the data were encrypted (some add that this is true only if there is no reason to believe that the decryption key was also compromised). The laws, and regulations adopted under them, typically do not specify the level or kind of encryption. For example, California’s Office of Privacy Protection published guidance on “Recommended Practices on Protecting the Confidentiality of Social Security Numbers” in April 2007, which has only this to say about encryption, on page 11:
“Protect records containing SSNs, including back-ups, during storage by encrypting the numbers in electronic records or storing records in other media in locked cabinets.”
Partly as a consequence of these security and breach notice laws, organizations should limit their use and storage of these categories of personal data to the extent they are really necessary for business operations. Storage on servers or on archived media, and transmission over internal networks and VPN connections, may or may not be sufficiently secure without encryption, depending on the company’s risk assessment and IT security practices. Organizations should encrypt such data when it is resident on laptops or other portable devices and when it is in transit over the public Internet.
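By way of illustration, the sketch below shows one common way to encrypt a file of sensitive records at rest before it is copied to a laptop or USB drive: AES-256 in GCM mode via the open-source Python cryptography package. The file names and the in-memory key handling are illustrative assumptions only; none of the laws discussed here prescribes this particular approach.

```python
# Illustrative sketch only: encrypt a file of sensitive records with AES-256-GCM
# before it is copied to a laptop or USB drive. Requires the third-party
# "cryptography" package (pip install cryptography). File names are hypothetical,
# and a real deployment would keep the key in an HSM or OS key store, not in code.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_file(plain_path: str, cipher_path: str, key: bytes) -> None:
    """Write cipher_path as nonce || ciphertext-with-auth-tag."""
    nonce = os.urandom(12)                       # 96-bit nonce, unique per encryption
    with open(plain_path, "rb") as f:
        plaintext = f.read()
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    with open(cipher_path, "wb") as f:
        f.write(nonce + ciphertext)

def decrypt_file(cipher_path: str, key: bytes) -> bytes:
    """Return the plaintext; raises InvalidTag if the file was tampered with."""
    with open(cipher_path, "rb") as f:
        blob = f.read()
    return AESGCM(key).decrypt(blob[:12], blob[12:], None)

if __name__ == "__main__":
    with open("customer_ssns.csv", "w") as f:    # fictitious demo data
        f.write("name,ssn\nJane Doe,000-00-0000\n")
    key = AESGCM.generate_key(bit_length=256)    # store separately from the encrypted file
    encrypt_file("customer_ssns.csv", "customer_ssns.csv.enc", key)
    print(decrypt_file("customer_ssns.csv.enc", key).decode("utf-8"))
```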
Massachusetts and Nevada, however, have recently adopted stricter and more specific rules that may become a model for other states. These increase the regulatory pressure to encrypt protected categories of personal data.
Massachusetts
The Massachusetts Personal Information Security Regulation (201 CMR 17.00) is now scheduled to take effect on March 1, 2010. The Regulation was promulgated by the Office of Consumer Affairs and Business Regulation (OCABR) under the authority of the Massachusetts personal information security law.
The Regulation will require all parties that “own or license” any of the protected categories of personal data concerning Massachusetts residents to encrypt the data on laptops or other portable devices, as well as in wireless transmissions and in transmissions over public networks.
Note that the Regulation does not limit its coverage of financial account data to cases where the access code or PIN is compromised, as do most security and breach notice laws. The Regulation extends to any nonpublic financial account or payment card data, as well as to SSNs and driver’s license numbers. The Regulation does not cover medical information, however.
The Regulation mandates a number of “Computer System Security Requirements” (201 CMR sec. 17.04) for businesses that handle the protected categories of personal data. These expressly include the following:
“(3) Encryption of all transmitted records and files containing personal information that will travel across public networks, and encryption of all data containing personal information to be transmitted wirelessly . . .
(5) Encryption of all personal information stored on laptops or other portable devices . . .”
The level and type of encryption are not specified.
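In practice, “encryption of all transmitted records and files containing personal information that will travel across public networks” is most often satisfied by sending the data only over a certificate-verified TLS connection. The sketch below, which assumes a hypothetical HTTPS endpoint, shows how a sender might refuse anything weaker than TLS 1.2 using Python’s standard library; it illustrates the general technique, not a reading of what the Regulation requires.

```python
# Illustrative sketch: transmit a record containing personal information only over
# a certificate-verified TLS 1.2+ connection. The endpoint URL is a placeholder.
import json
import ssl
import urllib.request

context = ssl.create_default_context()             # verifies the server certificate chain
context.minimum_version = ssl.TLSVersion.TLSv1_2   # refuse SSLv3 / TLS 1.0 / TLS 1.1

payload = json.dumps({"record_id": "A-12345"}).encode("utf-8")
request = urllib.request.Request(
    "https://records.example.com/api/submit",      # hypothetical endpoint
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(request, context=context) as response:
    print(response.status)                         # 2xx indicates the record was accepted
```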
Nevada
Nevada recently amended its personal information security law, which already required “reasonable” security measures as well as breach notice (Nevada Rev. Stats. secs. 603A.010 et seq.). The amendments take effect on January 1, 2010.
The law covers SSNs, driver’s license numbers, and payment card or financial account data in combination with an access code or PIN. Medical information is not covered.
Under the amended law, businesses that accept payment cards (credit cards and debit cards) must comply with the Payment Card Industry Data Security Standard (PCI DSS). In addition, a party handling any of the protected categories of information must encrypt the data if it transfers the data electronically “outside of the secure system of the data collector” or if the data is stored on a device (laptop, USB drive, etc.) that is moved “beyond the logical or physical controls of the data collector or its data storage contractor.”
“Encryption” is defined in the amendments with reference to “established standards,” specifically including FIPS and mentioning the need for standards-based key management as well as encryption protocols:
“‘Encryption’ means the protection of data in electronic or optical form, in storage or in transit, using:
(1) An encryption technology that has been adopted by an established standards setting body, including, but not limited to, the Federal Information Processing Standards issued by the National Institute of Standards and Technology, which renders such data indecipherable in the absence of associated cryptographic keys necessary to enable decryption of such data; and
(2) Appropriate management and safeguards of cryptographic keys to protect the integrity of the encryption using guidelines promulgated by an established standards setting body, including, but not limited to, the National Institute of Standards and Technology.”
Thus, while the law itself does not specify the form of encryption, it puts the burden on the user to choose an appropriate and standards-based method.
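As a rough sketch of what a standards-based choice could look like, the example below pairs AES-256-GCM (AES is specified in FIPS 197) with a key derived by PBKDF2-HMAC-SHA-256, a derivation function described in NIST SP 800-132. The passphrase, salt handling, and iteration count are assumptions for illustration only; the Nevada statute leaves the specific selection to the data collector.

```python
# Sketch of a standards-based configuration: AES-256-GCM (FIPS 197 block cipher)
# with a key derived via PBKDF2-HMAC-SHA-256 (NIST SP 800-132). Parameters and the
# passphrase are illustrative; real key management would follow guidance such as
# NIST SP 800-57 rather than keeping secrets in source code.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def derive_key(passphrase: bytes, salt: bytes) -> bytes:
    """Derive a 256-bit key; the salt is stored alongside the ciphertext."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=600_000)
    return kdf.derive(passphrase)

salt = os.urandom(16)
key = derive_key(b"replace-with-a-managed-secret", salt)
nonce = os.urandom(12)
record = b"driver's license: X000-0000-0000"        # fictitious data
ciphertext = AESGCM(key).encrypt(nonce, record, None)
# The stored bundle (salt, nonce, ciphertext) is indecipherable without the passphrase.
```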
HITECH
Title XIII of ARRA, the federal economic recovery legislation adopted early in 2009, is labeled the Health Information Technology for Economic and Clinical Health Act (HITECH). It amends the HIPAA medical privacy provisions by adding a federal security breach notice requirement for nonpublic, personally identifiable health information. While HIPAA applies only to certain covered entities (healthcare providers, insurers, and clearinghouses), HITECH also applies to “business associates” that provide services to those entities. It also reaches employers that are covered by HIPAA because, for example, they operate company clinics or manage their own health plans.
HITECH requires notice to affected individuals when there has been a security breach exposing personally identifiable health data. HIPAA already lists 18 identifiers (names, addresses, SSNs, health plan ID numbers, etc.) that must be removed to establish that health records have been “de-identified.” Where compromised records have not been fully de-identified by removing these data fields, HITECH sec. 132400 also recognizes that the information may not be personally identifiable if it is effectively encrypted:
“(b) Implementation specifications: Requirements for de-identification of protected health information. A covered entity may determine that health information is not individually identifiable health information only if:
(1) A person with appropriate knowledge of and experience with generally accepted statistical and scientific principles and methods for rendering information not individually identifiable:
(i) Applying such principles and methods, determines that the risk is very small that the information could be used, alone or in combination with other reasonably available information, by an anticipated recipient to identify an individual who is a subject of the information; and (ii) Documents the methods and results of the analysis that justify such determination; . . . .”
Thus, HITECH does not specify a particular form of encryption but leaves it to IT security experts to decide whether the data are effectively unidentifiable in the hands of an unauthorized user. Note that the statute requires covered entities to maintain documentation of this professional analysis, and that the analysis must be based on “generally accepted” principles and methods – which means that professional opinions are likely to refer to published specifications and industry standards.
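For a sense of the mechanical side of de-identification, the toy sketch below strips fields corresponding to HIPAA’s listed identifiers from a record before it is shared. The field names are hypothetical, and this step alone does not substitute for the documented expert analysis quoted above.

```python
# Toy sketch: remove fields corresponding to HIPAA's listed identifiers from a
# record before sharing it for analysis. Field names are hypothetical, and this
# mechanical step does not by itself satisfy the expert-determination analysis.
IDENTIFIER_FIELDS = {
    "name", "street_address", "ssn", "health_plan_id",
    "email", "phone", "medical_record_number", "birth_date",
}

def strip_identifiers(record: dict) -> dict:
    """Return a copy of the record with identifier fields removed."""
    return {k: v for k, v in record.items() if k not in IDENTIFIER_FIELDS}

record = {"name": "Jane Doe", "ssn": "000-00-0000", "diagnosis_code": "E11.9"}
print(strip_identifiers(record))   # {'diagnosis_code': 'E11.9'}
```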
Red Flags
The 2007 Identity Theft Red Flags Rule (promulgated under the 2003 FACTA amendments to the federal Fair Credit Reporting Act) went into effect in November 2008, although the FTC suspended enforcement until November 1, 2009. (Similar rules were issued by the federal financial regulatory agencies, for the institutions they supervise.) The Rule requires covered entities to develop and implement written policies to prevent identity theft, including recognition of warning signs or “red flags” of suspected ID theft.
The Rule applies not only to traditional financial institutions but also to “creditors,” defined as companies that “regularly defer payment for goods or services,” whether or not they charge interest or finance charges, and that therefore store personal information about individual debtors. Some employers, for example, sell goods or services to employees on deferred payment terms and may be treated as covered entities for that reason. (However, the Red Flag FAQs written by FTC staff take the view that an employer is not a covered entity simply because it sponsors a 401(k) or other qualified retirement plan that allows participants to borrow from their retirement funds.)
For covered entities, the mandatory policy to prevent ID theft must identify signs of possible security breaches involving certain data, as well as appropriate responses to those alerts. The covered data are SSNs and tax identification numbers, healthcare IDs, financial account and credit/debit card details, personally identifiable medical information, and identifying data from consumer reports (which are often used for employee background checks as well as for credit applications).
The Rule itself does not mandate encryption measures. However, most covered entities will necessarily address encryption in their written anti-ID theft policies. Their “red flags” should also include an alert if there is evidence that encryption keys have been misused, stolen, or hacked.