Neopets, a website that allows children to care for “virtual pets,” has exposed a wide range of sensitive data online, including credentials needed to access company databases, employee emails, and even repositories containing the site’s proprietary code, according to information shared with The Security Ledger.

The data includes the IP addresses of Neopets visitors, information that could be used to target Neopets users, according to independent researcher John Jackson, who said he discovered the information after scanning the company’s website with a security tool.

Stolen Accounts For Sale

Neopets is a “virtual pet website” that first launched in 1999. It permits users – many of them children – to care for virtual pets and buy virtual items for them using virtual points earned in-game (Neopoints) or with “Neocash,” which can be purchased with real-world money or won in-game. Viacom purchased the site for $160 million in 2005; in 2017 it was acquired by the Chinese company NetDragon.

In an email to The Security Ledger, Jackson said that he noticed Neopets accounts being offered for sale on an online forum. That prompted him to run a scan on the Neopets site using a forensics tool. That scan revealed a Neopets subdomain that exposed the guts of the Neopets website, Jackson said via instant message.


“We looked through and found employee emails, database credentials and their whole codebase,” he said.

Jackson shared screenshots of the Neopets directory as well as snippets of code captured from the site that suggest credentials were “hard coded,” or embedded in the underlying code of the website. Working with security researcher Nick Sahler, Jackson was able to download the website’s entire codebase, revealing database credentials, employee emails, user IP addresses and private code repositories. The two researchers also uncovered internal IP addresses and the underlying application logic for the entire Neopets application.
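To illustrate why hard-coded credentials are so dangerous, here is a minimal sketch (with invented names and values, not the actual Neopets code): secrets embedded in source travel with every copy of the codebase, while secrets read from the environment at runtime never appear in the repository.

```python
import os

# Anti-pattern (illustrative, invented values): credentials embedded in source.
# Anyone who can read the codebase -- as the researchers could -- can read them.
HARDCODED_DB_USER = "db_admin"
HARDCODED_DB_PASS = "hunter2"

def get_db_credentials() -> tuple:
    """Safer pattern: read secrets from the environment at runtime,
    so they are configured on the server and never checked into code."""
    return (os.environ.get("DB_USER", ""), os.environ.get("DB_PASS", ""))
```

With this pattern, leaking the source code alone does not leak the database password.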

Snippet of code from the Neopets website showing hard coded credentials. (Image courtesy of John Jackson.)

“This is extremely bad because even though we didn’t attempt to access PII (personally identifying information), with these codebases we can undoubtedly do so,” Jackson said. “They need to fix the root issues, otherwise they will suffer yet another threat-actor related breach.”


Jackson and Sahler said they have reported their findings to Neopets and provided copies of email exchanges with a support tech at the company who said he would pass the issue to “one of our coders.”

Neopets has not yet responded to requests for comment on the researchers’ allegations.

If true, this would be the second serious security incident involving the Neopets site. In 2016, the company acknowledged a breach that spilled usernames, passwords, IP addresses and other personal information for some 27 million users. That breach may have occurred as early as 2013, according to the website HaveIBeenPwned.

The issue appears to be related to a misconfigured Apache web server, Jackson said. Though many web-based applications are hosted on infrastructure owned by cloud providers such as Amazon, Google or Microsoft’s Azure, the leaked files indicate that the 20-year-old Neopets website continues to operate on infrastructure the company owns and operates itself.
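One common Apache misconfiguration that exposes a site’s files is an enabled directory index, where the server lists a directory’s contents when no index page is configured. As a purely illustrative sketch (not the researchers’ actual tooling), a scanner might flag a fetched page with a heuristic like this:

```python
def looks_like_apache_index(html: str) -> bool:
    """Heuristic check for Apache's autoindex (directory listing) page.
    The default autoindex template titles the page "Index of /<path>" and
    links to "Parent Directory". Illustrative only; real scanners use
    many more signals."""
    markers = ("<title>Index of /", ">Parent Directory<")
    return any(marker in html for marker in markers)
```

A page matching these markers suggests the server is exposing raw files, which is how a site’s “guts” (source code, configuration, credentials) can end up publicly browsable.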


Misconfigured web servers are a frequent source of security breaches, whether self-hosted or hosted by a third party. In 2017, credit rating agency Equifax acknowledged that a hole in the Apache Struts platform, first identified and patched in March 2017, was used by hackers to compromise a web application and gain access to information including names, email addresses and, for U.S. residents, Social Security Numbers. The vulnerability, identified as CVE-2017-5638, was associated with a string of attacks in 2017 and 2018.

High Bar for Collecting Information on Children

The breach could spell legal trouble for Neopets and NetDragon. Online firms that manage information on children are held to a high standard under the federal Children’s Online Privacy Protection Act (“COPPA”).

In June, the U.S. Federal Trade Commission (FTC) announced that it reached a settlement with children’s mobile application developer HyperBeard Inc. that included a $4 million fine for COPPA violations for failing to obtain parental consent before processing children’s personal information for targeted advertising. (HyperBeard ultimately paid just $150,000 of that penalty, citing an inability to pay the full amount.)

In September 2019, Google and its YouTube subsidiary agreed to pay a record $170 million fine to settle allegations by the Federal Trade Commission and the New York Attorney General that the YouTube video sharing service violated COPPA by illegally collecting personal information from children without their parents’ consent.

There are billions of Internet of Things (IoT) devices out there in the world, and this number will only grow. I’ve written before about smart light bulbs and smart security cameras, and it’s no secret that I am fascinated by IoT technology. When I came across the Mozilla *privacy not included guide, I knew I had to share this website.

The guide covers a range of “smart” products for home and office and summarizes the relevant, publicly available information about the privacy and data collection practices of the 136 smart products listed on the website. Clicking on a particular product brings up a summary of that product’s data collection and privacy policies, and users are also able to rate products along a “creepiness” scale.

The guide’s criteria include whether a product uses encryption, receives automatic security updates, requires strong passwords, has a system to manage vulnerabilities, and has an accessible privacy policy. According to the website, a new feature of the guide is warning labels on certain products that consumers should “think twice about before buying.” Items marked with a yellow triangle icon with an exclamation point carry the label: “warning: *privacy not included with this product.” The website also answers questions about whether a product can snoop on you, whether an email address is required to sign up, and what personal data the device collects; all important things to know before you connect a smart product you may be buying.

As I wrote previously on our blog, the Massachusetts Right to Repair amendment passed in November is up against a lawsuit from auto manufacturers. Now, the Massachusetts Attorney General’s office has responded, stating that the state law does not conflict with any federal statute and that voters already rejected all of the lawsuit’s allegations. The Attorney General’s office further argues that the primary claim of the lawsuit relies on non-binding agency guidance, which is simply not enough to preempt the amendment, and notes that the Supreme Court and the First Circuit have established a heavy burden for facial, pre-enforcement challenges. At this point, the Attorney General has agreed not to enforce the law until the litigation has concluded. Massachusetts argues that rejecting the law before it takes effect would be subversive to the democratic process. The case is set for a bench trial in June 2021, and we’ll follow it as it makes its way into the new year.

On December 11, 2020, California Attorney General Xavier Becerra released the fourth set of proposed modifications to the regulations of the California Consumer Privacy Act of 2018 (CCPA). This fourth set of proposed modifications is in response to comments received to the third set of modifications that were released on October 12, 2020. According to the update released with the proposed text, the changes include:

Revisions to section 999.306, subd. (b)(3), to clarify that a business selling personal information collected from consumers in the course of interacting with them offline shall inform consumers of their right to opt-out of the sale of their personal information by an offline method; and

Proposed section 999.315, subd. (f), regarding a uniform button to promote consumer awareness of the opportunity to opt-out of the sale of personal information.

The text of the proposed modifications can be found here. Probably the biggest news for the opt-out option is the proposal to include an opt-out button, which may be used in addition to posting the right to opt-out, but not in lieu of any requirement to post a “Do Not Sell My Personal Information” link. The proposed regulations state that if a business posts the “Do Not Sell My Personal Information” link, then the opt-out button shall be added to the left of that text.

The proposed modifications also add language stating that submitting requests to opt out shall be easy for consumers to execute and shall require minimal steps. Businesses are not to use confusing language for opt-out requests or to require consumers to click through or listen to reasons why they should not submit a request to opt out. Businesses may not require consumers to provide personal information that is not necessary to implement the request, nor can a business require the consumer to search or scroll through the text of a privacy policy to locate the opt-out mechanism. In short, the proposed modifications appear to strive for a simple process with minimal steps for consumers to opt out of the sale of their personal information.

The Attorney General’s Office will accept written comments on the proposed changes to the regulations until 5:00 p.m. on December 28, 2020. Comments may be sent by email to [email protected] or by mail at the address contained in the notice of the fourth set of proposed modifications.

A group of automakers through the Alliance for Automotive Innovation is suing Massachusetts in federal court to block the new ‘Right to Repair’ law that passed on November 3rd. This law was known as “Question 1” to Massachusetts residents hitting the polls earlier this month.  As we discussed in our prior blog post, the new state law expands access to certain diagnostic and repair data collected by onboard computer systems that is currently only accessible in ‘real-time’ by the manufacturers (and in turn, their dealers). The lawsuit argues that it will impose a financial burden on auto manufacturers and threatens the privacy of car owners by exposing data from their vehicles. We discussed many of these privacy and security concerns in our post back in October when consumers were still contemplating whether they wanted their small autobody shops to have more access to their data or to prevent more sharing of their vehicle’s data.

The lawsuit asks the court to declare the new Right to Repair expansion legally unenforceable. It claims that the new law violates numerous federal laws related to cybersecurity and intellectual property. The lawsuit also repeats arguments that auto manufacturers made during the ballot campaign: that independent autobody shops already have access to the data they need to fix consumers’ vehicles under the existing Right to Repair law.

Moreover, manufacturers say the requirement that they install a standardized “platform” on all cars with telematics technology sold in Massachusetts by model year 2022 forces them to comply immediately, because the first 2022 models are already being prepared for market.

Finally, the lawsuit relies heavily on a testimonial letter that the National Highway Traffic Safety Administration (NHTSA) sent to a committee of the state legislature back in July, which stated that Question 1 posed new cyber risks by compromising the integrity of a vehicle’s functions such as steering, acceleration and braking. However, the NHTSA also stated in its letter that manufacturers should continue to control those vehicle functions, which, on its face, the new Right to Repair law seems to support: the new system in 2022 models will communicate “mechanical data,” and the proposed definition of “mechanical data” covers information “related to the diagnosis, repair or maintenance of the vehicle.” That would NOT include telematics data collected in relation to an immobilizer system or security-related electronic modules, and that exception is not being stricken by the proposed revisions.

We will follow this lawsuit to see how it shapes access to vehicle data not only in Massachusetts but across the country as a whole as more and more cars are equipped with real-time telematics data collection and transmission.

The California Consumer Privacy Act (CCPA) requires businesses covered by the CCPA to notify their employees of the categories of personal information the business collects about employees and the purposes for which the categories of personal information are used. The categories of personal information are broadly defined in the CCPA and include personal information such as medical information, geolocation data, biometric information, and sensory data.

As a result of the COVID-19 pandemic, many businesses are screening employees for COVID-19 symptoms. In many states, it is either required or recommended that businesses conduct such screenings before employees enter the workplace. These screenings vary across the country, but many include documenting an employee’s temperature, any COVID-related symptoms or exposure to individuals with COVID-19, or travel out of state or out of the country. States also vary in the method of collection, with employees completing written questionnaires via email, text, or mobile application. COVID-19 screening and temperature data are recorded and kept daily to demonstrate compliance with state and local public health requirements.

So, what does this mean for CCPA compliance? None of us could have predicted a year ago that employers would be collecting temperature data, lists of symptoms, and travel information from their employees. If you drafted your CCPA employee notice prior to the start of the pandemic, you may want to review the categories of personal information you now collect in light of these COVID-19 data collection requirements and recommendations. For example, depending upon the type of temperature check, this data could be considered biometric information or sensory data. Your employee notice may also need to disclose how such categories of personal information are used by the business, such as to comply with state and local public health requirements.

While the CCPA requires notice to employees of the categories of data collected, in light of the pandemic, businesses may wish to review their employee notice to determine if it needs to be updated to accurately reflect any additional categories of personal information collected and how the business is using that personal information.

With the passage of the ballot initiative known as the Consumer Privacy Rights Act (CPRA or Act) in California, we are presenting several blog articles on different topics related to this new law. Last week, we wrote about the newly-added definition of sensitive information. This week we will focus on some key effective dates in the CPRA along with what it will mean to have a separate privacy rights enforcement agency.

CPRA Effective January 1, 2023

The good news is that the CPRA’s effective date is January 1, 2023, so businesses have some time to assess and get ready for the new law while the California Consumer Privacy Act (CCPA) is still in effect and enforceable. The CPRA functions like an overlay to CCPA. Once the CPRA takes effect in 2023, it will become the privacy law of the land in California.

There is one exception to the 2023 effective date and that is with respect to the right of access. The CPRA’s right to know or right of access applies to personal information collected by a business on or after January 1, 2022. The exemptions for employee information and business-to-business information remain in place until January 1, 2023. The CPRA also provides additional rulemaking authority, which may also take place prior to the effective date.

Creation of the California Privacy Protection Agency

Section 24 of the CPRA creates the California Privacy Protection Agency (CPPA or Agency), established in the state government of California. The Agency is vested with full administrative power, authority, and jurisdiction to implement and enforce the California Consumer Privacy Act. Section 1798.199.10(a) states that: “[t]he Agency shall be governed by a five-member board, including the Chair. The Chair and one member of the board shall be appointed by the Governor. The Attorney General, Senate Rules Committee, and Speaker of the Assembly shall each appoint one member. These appointments should be made from among Californians with expertise in the areas of privacy, technology, and consumer rights.” Subsection (b) states that the initial appointments to the Agency shall be made within 90 days of the effective date of the Act.

The board will have the authority to appoint an executive director, and the Agency will have broad powers to protect “the fundamental privacy rights of natural persons with respect to the use of their personal information.” Section 1798.199.40(c). The CPRA allows individuals, businesses, customers, advocacy groups and vendors to file complaints with the Agency regarding the privacy practices of a business. The Agency will have the power to investigate complaints, to hold hearings to determine whether a violation has occurred, and to issue cease-and-desist orders and orders to pay an administrative fine of up to $2,500 for each violation, or up to $7,500 for each intentional violation and each violation involving the personal information of minor consumers. The Agency also has the power to bring a civil action in superior court to collect unpaid administrative fines.

The Agency also is charged with providing guidance to both consumers and businesses regarding their rights and responsibilities under the CPRA. One final note is that Section 1798.199.100 states that the Agency “shall consider the good faith cooperation of the business, service provider, contractor, or other person in determining the amount of any administrative fine or civil penalty for a violation of this title.”

This week, the Canadian government proposed new legislation, Bill C-11, the Digital Charter Implementation Act (the Act), which includes some hefty fines for companies for violations: up to 5 percent of their revenue or C$25 million, whichever is higher. The Act would increase protections for Canadians’ personal information by giving citizens more control and requiring greater transparency from companies handling their information. The Act addresses consent, data portability, consumer control over their “online identity,” disposal of personal information, and de-identification rules. A Fact Sheet about the proposed law outlines its effect on Canadian citizens and their privacy rights.
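The fine ceiling described above is a simple “whichever is higher” formula. As a sketch with hypothetical revenue figures (the function name is ours, not from the bill):

```python
def c11_max_fine_cad(annual_revenue_cad: float) -> float:
    """Ceiling on fines described for Bill C-11: 5% of a company's
    revenue or C$25 million, whichever is higher."""
    return max(0.05 * annual_revenue_cad, 25_000_000.0)
```

So a company with C$1 billion in revenue would face a ceiling of C$50 million, while the C$25 million floor governs for any company with revenue under C$500 million.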

This Act would update the existing federal Canadian privacy law (i.e., the Personal Information Protection and Electronic Documents Act, or PIPEDA) by requiring a privacy management program that is submitted to the Office of the Privacy Commissioner upon request.

This revamp from the Canadian government possibly stems from the challenge to international data flows in the recent Schrems II decision in the European Union and as the U.S. considers its own federal privacy legislation once again.

Part of the Bill also introduces the Personal Information and Privacy Protection Tribunal Act (PIPPTA), which seeks to establish a faster path for enforcing orders of the Office of the Privacy Commissioner, expand that office’s role, and strengthen enforcement.

We will watch this closely as it progresses.

How will a Biden-Harris presidency affect the U.S. privacy landscape? Let’s take a look.

Federal Privacy Legislation

On both sides of the political aisle, there have been draft proposals on federal privacy legislation in the last 18 months. In September, movement actually happened with the Setting an American Framework to Ensure Data Access, Transparency and Accountability (SAFE DATA) Act, introduced in the U.S. Senate.

With a Biden-Harris administration, there is potential for continued movement on federal privacy legislation. This movement would likely come from Congress since both the Republicans and Democrats have previously supported (and are pushing for) privacy bills.

E.U.-U.S. Privacy Shield and Data Transfers

With the 2020 “Schrems II” decision looming over international data transfers, the Biden-Harris administration is likely to pave the way for negotiations with the European Commission on a new version of the Privacy Shield. However, the Schrems II ruling will continue to pose a real challenge. The hope is that there can be effective, productive dialogue with the E.U., and that the U.S. can convey that there is a mutually beneficial relationship between intelligence agencies in the U.S. and E.U. member states.

FTC Enforcement and FCC Rules

During Chairman Joseph Simons’ tenure, the Federal Trade Commission (FTC) has been very active on privacy issues. Examples include the FTC’s enforcement actions against Facebook, Google and YouTube, as well as the Children’s Online Privacy Protection Act (COPPA) rulemaking proceeding held in 2019. Just this past week, the FTC announced a settlement with Zoom for alleged data security failings. While the FTC was certainly busy under a Republican-led agency, it is likely that we will see a heightened level of scrutiny and more enforcement under a Biden-Harris administration. While Chairman Simons can serve until 2024, he might step down, and it is also likely that the FTC will gain more Democratic commissioners.

For the Federal Communications Commission (FCC), a Biden-Harris administration may also lead to a revival of the net neutrality rules.


Many experts agree that cyber-attacks are the number one national security threat to the U.S., both from a geopolitical and an economic standpoint. The recent Cyberspace Solarium Commission report states that one of the biggest reasons for continued cybersecurity problems in the U.S. is a failure of strategy and leadership in this arena, and that now is the time for greater government accountability in defending against cyber-attacks.

Big Tech and the U.S.’s International Relationships

There has been a lot of scrutiny on how a Biden-Harris administration will regulate Big Tech in Silicon Valley. Biden has already pledged to create a task force for investigating online harassment, extremism and violence, so it is likely that there will be a focus on privacy, surveillance and hate speech online through some of the Big Tech players in Silicon Valley. We may also see some shifts in the U.S.’s relationship with China when it comes to privacy.

Of course, none of this change will happen overnight, so we’ll be watching as the train chugs forward.

The California Privacy Rights Act (CPRA) expands the definition of personal information as it currently exists in the California Consumer Privacy Act (CCPA). The CPRA adds “sensitive personal information” as a defined term, which means:

(1) personal information that reveals:

(A) a consumer’s social security, driver’s license, state identification card, or passport number;

(B) a consumer’s account log-in, financial account, debit card, or credit card number in combination with any required security or access code, password, or credentials allowing access to an account;

(C) a consumer’s precise geolocation;

(D) a consumer’s racial or ethnic origin, religious or philosophical beliefs, or union membership;

(E) the contents of a consumer’s mail, email and text messages, unless the business is the intended recipient of the communication;

(F) a consumer’s genetic data; and

(2) (A) the processing of biometric information for the purpose of uniquely identifying a consumer;

(B) personal information collected and analyzed concerning a consumer’s health; or

(C) personal information collected and analyzed concerning a consumer’s sex life or sexual orientation.

This is perhaps the broadest definition of personal information in the country, as it now includes entirely new classes of personal information such as racial or ethnic origin, religious or philosophical beliefs, union membership, the contents of a consumer’s mail, email and text messages, genetic data, biometric data, and data collected and analyzed concerning a consumer’s health, sex life, or sexual orientation.

What does this mean for a business that is covered by the CPRA? In a previous post, we provided a detailed overview of the CPRA, but suffice it to say that if a business had to comply with the CCPA, it will likely be covered by the CPRA as well. Given this new definition of sensitive personal information, one of the first steps in thinking about CPRA compliance will be data mapping: determining whether the business collects any of these new categories of sensitive personal information. The CPRA is still very much a consumer-focused law, with the goal of expanding consumer knowledge about the types of personal information businesses collect about consumers and how that personal information is used, sold, or shared. Understanding the data and personal information a business collects about consumers, and whether any of it is sensitive personal information under this new definition, will be a critical first step.