foota 11 hours ago

It's mentioned in nested comments, but (as you'd probably expect) Meta does not intend to store passwords in plaintext. There was a bug where they were logging plaintext passwords for some period of time, e.g., when someone tried to log in.

  • nolok 10 hours ago

    And they're not fined for storing in plaintext, nor for storing in plaintext by mistake; they're fined because the law gives you a time limit to notify the regulator after you notice it, and they waited too long.

    And in this specific case, just to be clear, it's not about taking too long to notify the public / customers, but about taking too long to notify the regulator (that deadline is much shorter). They're not expected to have perfect facts when they notify: there is no sanction for notifying and saying "but we're not sure yet" if you're being honest, or for coming back later with corrections. There is a sanction for not telling them in time.

    We often see "companies should be responsible / should have to inform me", and that's part of this regulation. It only works if there are clearly defined deadlines, and sanctions when they're not respected.

    • jart 10 hours ago

      How long has that even been a regulation and in which countries does it apply? Software engineers are trained to view these kinds of things as bugs. Legal isn't trained to monitor bug trackers.

      • oliwarner 8 hours ago

        > Software engineers are trained to view these kinds of things as bugs

        Competent engineers, software or otherwise, must have an education in safety standards and legal regulations. I had a pretty formal education in data protection at both A-Level and undergrad. I know real engineers get tetchy about us programmers edging in, so if you want any claim to an engineering title, ignoring the real-world ramifications of your code is unacceptable.

        But that doesn't seem to be the problem here. Somebody did know it was bad, did fix it urgently, did report it internally and did an impact assessment. The problem is that they needed to notify the regulator earlier, so the regulator knew it could have been a problem.

        If these passwords were in the wild, every day of delayed notification gives attackers more time to use stolen credentials. $100m sounds like a lot, but many of these regulatory fines scale with the company so that punishments like this have impact. They need to improve how they handle security notifications.

        • bigtimesink an hour ago

          An education in legal regulations teaches you that the best solution is to make the fix and not say anything.

        • bcrl an hour ago

          I think the big difference is that engineers have had to care about people dying for the past 100 years. Today there are software developers that throw code out into the world that kills people without any repercussions. At some point this needs to change.

      • alexmorley 9 hours ago

        It’s part of GDPR. I’ve been given training on it at all (3) companies I’ve worked for and training has always included what constitutes a breach and what to do.

        I would hope any company would treat it as an incident rather than just a bug where senior enough folks would be involved to know what their responsibilities are.

      • JCharante 9 hours ago

        It's GDPR, per the subheader:

        > The Irish Data Protection Commission found that the company violated several GDPR rules.

        This is why lots of websites block EU visitors: you basically need to consult with lawyers to make sure you're not accidentally breaking the law when writing a codebase.

        • omnimus 9 hours ago

          “Lots”? Not really, as most companies want access to the European market.

          Also, no, you don't need to consult lawyers when writing code. You just don't track and save data and do questionable stuff with it. Saving passwords in logs is surely a security issue first, before it's a GDPR issue.

          • JCharante 9 hours ago

            > “lots” not really as most companies want accesss to european market.

            Plenty of foreign newspapers block EU visitors from their sites. The EU is not that big a market.

          • JCharante 9 hours ago

            yes it's a security issue but you wouldn't "expect" to get fined millions of dollars.

            Do I think we should punish companies for storing passwords in plaintext? Yes. Would I expect that a bug and devs untrained in GDPR best practices could lead to fines? No.

            Usually in software engineering you don't get your company fined for making terrible mistakes unless you're in a field like finance. This was just passwords which most sites have, not something like PCI DSS stuff

            • nolok 2 hours ago

              > yes it's a security issue but you wouldn't "expect" to get fined millions of dollars.

              Which is exactly why companies don't care, which is why this regulation was made and those fines decided.

              > Usually in software engineering you don't get your company fined for making terrible mistakes unless you're in a field like finance.

              You're not fined for a mistake. You're fined for a mistake AND that mistake hurting the customer more than you AND failing to disclose it to them swiftly.

        • maeil 8 hours ago

          I see this comment pop up often in these threads about EU fines and regulations: "Apple/company should just call the EU's bluff and stop selling in the EU!"

          Apple's a good example because they're such an incredibly global brand, who should be less reliant on EU customers. Yet Europe is responsible for >20% of their revenue. Shareholders would eat you alive for just "nope"ing away from that.

          Yes, US GDP/capita is far above the EU average, but the EU still represents 450 million, on average fairly wealthy people. So companies simply play ball. And that excludes the UK, whose data protection laws are similarly strict.

          • Eddy_Viscosity2 7 hours ago

            Noping out of europe would create a vacuum that would be filled by a competitor who would fully comply with the consumer protecting EU rules. Rules that many consumers in other nations would love to have themselves but don't because of regulatory capture and regular corruption. Those consumer friendly products would become popular outside of Europe. Big-US-monopoly-company would lose dominance. They would either have to change to be more consumer friendly or eat the loss of market share.

          • JCharante 8 hours ago

            20% of revenue isn't that much. Would you rather focus on your core product and double your revenue, or focus on getting that 20%? Yes giant companies have the resources and experience less YoY growth so they will work with the EU market, but most companies would do better to ignore the EU.

            • nolok 2 hours ago

              > 20% of revenue isn't that much

              It is in reality gigantic, especially at that scale. And in this specific example, Apple net profit is 24% of their revenue.

              > Would you rather focus on your core product and double your revenue

              Saying you no longer sell to people with blue eyes or people wearing shorts does not in any way increase your sales to everyone else.

              I'm sorry, but your messages sound like you're not very knowledgeable about the subject matter.

            • philistine 7 hours ago

              Explain to me how not selling in the EU could double a company's revenue. How does removing yourself from your third-largest market enlarge the others? How are millions of people in the US not buying an iPhone because you can buy iPhones in Germany?

        • mdhb 9 hours ago

          I probably encounter this like 5 times a year, your statement is wrong.

    • dotps1 9 hours ago

      To be clear, they are absolutely being fined for storing passwords in plaintext.

      They chose not to mitigate the fine by following proper procedure.

  • oefrha 10 hours ago

    To the (intentionally?) obtuse responses: intending to store passwords in plaintext usually means storing plaintext passwords in databases and doing authentication with that; and that’s what the gazillion of commenters replying to the title are implying.

    Mistakenly logging credentials because of e.g. badly interacting HTTP middleware is still a very nasty bug, but it doesn’t count as intending to store passwords in plaintext.

    And something similar has happened to me, and I’m sure many others: logging middleware storing sensitive user data that shouldn’t be visible to engineers with mere access to logs (not as bad as logging credentials in my case, but still).

    • nicolas_t 10 hours ago

      According to the PCI DSS auditor I had when doing a PCI DSS Level 1 audit, the most common credit card leak vector is logging middleware logging credit card numbers.

    • Mountain_Skies 10 hours ago

      When I did application security I had to argue with developers about PCI and PII data in logs all the time. They would insist that there was "no other way" and that it was secure inside our system. I'd refuse to change the status of the vuln to anything other than critical. I found many similar vulns in our database that had been marked as false positives or non-critical by other infosec people in the company. The common thread was none of them had a background in software development so they just trusted what the developer told them. This seems to happen frequently in places where there's a culture of compliance being more important than actual security.

    • miningape 10 hours ago

      Can confirm, have broken GDPR before through logging customer (and customer's customer) PII. It lasted for about a month before we realised and I had to go through the logs 1 by 1 over 3 days to remove all the data.

    • slim 10 hours ago

      unless you are Facebook and tightly collaborating with police and intelligence in countries with no respect for human rights

    • nomilk 10 hours ago

      hmm.. that's a very sympathetic take.

      Most frameworks blot out passwords from logs by default, so even a newbie programmer on their first day doesn't make the mistake of logging plaintext passwords, yet facebook somehow made that mistake...

      It should raise eyebrows when the security practices of SWEs at a billion dollar company are outperformed by any newbie developer working a toy project.

      • vidarh 10 hours ago

        Passwords are just data. If said data is not tagged in a way that makes it clear it is a password, finding an algorithm that will successfully blot out passwords in the general case is intractable without being far too aggressive to be useful.

        All such tools rely on assumptions that whatever gets logged follows certain rules the logging can check against. It's not hard to accidentally convert data to a format that, when logged, slips past these kinds of checks.

        They should have caught it. But it's not surprising that it occasionally happens.

        • nomilk 10 hours ago

          Not that hard [1]:

              Rails.application.config.filter_parameters += [
                :passw, :email, :secret, :token, :_key, :crypt, :salt,
                :certificate, :otp, :ssn, :cvv, :cvc
              ]
          
          [1] https://github.com/rails/rails/blob/8a2e28d7451d5ae4cb194fcc...

          • vidarh 9 hours ago

            Assuming, of course, that the data is logged as a parameter rather than as a raw string, or as an instance variable in another object, or any number of other ways. Developers thinking it is "not that hard" is a big red flag to me, suggesting odds are high your logs are full of things that should not be there. Using filters is a first step only.
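
To illustrate the failure mode vidarh describes, here is a toy sketch (hypothetical names, Python for illustration rather than anything Rails or Meta actually use): a key-based filter redacts a structured password parameter, but the same secret sails straight through once the payload has been serialized into a string before reaching the logger.

```python
import json

# Key-based redaction, similar in spirit to Rails' filter_parameters.
SENSITIVE_KEYS = {"passw", "secret", "token", "crypt", "salt"}

def redact(params: dict) -> dict:
    """Redact values whose key name contains a sensitive substring."""
    return {
        k: "[FILTERED]" if any(s in k.lower() for s in SENSITIVE_KEYS) else v
        for k, v in params.items()
    }

# Structured parameters: the filter works as intended.
print(redact({"user": "bob", "password": "S3cr!t"}))
# {'user': 'bob', 'password': '[FILTERED]'}

# But if the payload was serialized before it reached the logger,
# there is no "password" key left for the filter to match on:
raw_body = json.dumps({"user": "bob", "password": "S3cr!t"})
print(redact({"request_body": raw_body}))
# {'request_body': '{"user": "bob", "password": "S3cr!t"}'}
```

The filter isn't wrong, it's just blind to anything that doesn't arrive as a tagged key/value pair, which is exactly the class of bug that tends to survive review.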

        • tjoff 10 hours ago

          HN does it, if I post my password it will automatically change it to stars, see: *************

      • yunohn 10 hours ago

        I think it’s actually much easier to make such mistakes at large companies with sprawling codebases and potential for settings that inadvertently end up logging sensitive data. Especially for nobody to notice either.

        • nomilk 10 hours ago

          True but they're also better resourced in terms of humans (and their experience) and tooling that should prevent or at least catch any blunders quickly.

          • yunohn 9 hours ago

            You’d be surprised, but from my first-hand FAANG experience that is definitely not the case. Tooling can capture things that are known to it, but not everything is set up perfectly.

      • mulmen 10 hours ago

        > It should raise eyebrows when the security practices of SWEs at a billion dollar company are outperformed by any newbie developer working a toy project.

        Facebook isn’t “a billion dollar” company it’s “a 1,435 billion dollar” company.

        Excuses start to run thin.

  • sschueller 10 hours ago

    Boeing did not intend to have its plane crash when it installed the MCAS.

    • sealeck 10 hours ago

      And it turns out that plane crashes are much more serious failures than logging sensitive data internally. Not to say what Facebook has done isn't an embarrassing failure that really shouldn't happen, but they're clearly not the same thing.

  • nudgeee 7 hours ago

    Indeed, this is a common vector for leaking PII and sensitive data. For example, what looks like an innocuous logging/print statement in an exception handler ends up leaking all sorts of data.

    And it gets more messy when you start to ingest and warehouse data logs for on-call monitoring/analytics/etc, and now you have PII floating around in all sorts of data stores that need to be scrubbed.

    In a previous job, we handled credit card numbers. We added PII detectors to logging libraries that would scrub anything that looked like a credit card number. We used client-side encryption where the credit card numbers are encrypted on the client before sending to the backend, so the backend systems never see the plain credit card numbers, except for the system that tokenizes them.
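
The scrubbing idea nudgeee describes can be sketched roughly like this (an illustrative sketch, not their actual implementation): flag digit runs of card-like length, and only redact the ones that pass a Luhn checksum, which keeps false positives on order IDs and timestamps down.

```python
import re

# Card-like runs: 13-19 digits, optionally separated by spaces or dashes.
CARD_RE = re.compile(r"\b\d(?:[ -]?\d){12,18}\b")

def luhn_ok(digits: str) -> bool:
    """Luhn checksum: true for plausible card numbers."""
    total, parity = 0, len(digits) % 2
    for i, ch in enumerate(digits):
        d = int(ch)
        if i % 2 == parity:  # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def scrub(line: str) -> str:
    """Redact Luhn-valid card-like digit runs; leave other numbers alone."""
    def repl(m: re.Match) -> str:
        digits = re.sub(r"[ -]", "", m.group())
        return "[REDACTED-PAN]" if luhn_ok(digits) else m.group()
    return CARD_RE.sub(repl, line)

print(scrub("charged card 4111111111111111 at 10:31"))
# charged card [REDACTED-PAN] at 10:31
```

As they note, client-side encryption is the stronger control; a log scrubber like this is a backstop for the PANs that leak in anyway.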

  • tzs 5 hours ago

    Also, it is not against GDPR per se to store passwords in plain text. You are required to keep user data safe from unauthorized access and processing, and encryption is one way to help with that, but if you have it sufficiently protected by other means it would be OK under GDPR to have it in plain text.

    Avoiding ever storing passwords (or credit cards) in plain text [1] is actually harder than you might think.

    Even if you outsource password handling by using some third party authorization service so passwords should never even touch your servers, and handle credit card payments by having them handled on your checkout page by a frame or JavaScript or something that only ever sends them directly to your payment processor, you can still end up with the damn things in plain text on your servers.

    How? Because you receive emails to support that look like this:

    > Hello; I'm a subscriber to your service, account "Bob666", password "S3cr!t", billed to my credit card 4111111111111111 with security code 123. That card is expiring. My new card is 4012888888881881 with security code 712, expiration date May 2026. Please use my new card for renewals. Thanks!

    You may also receive messages like that in chat if you offer support via chat. And maybe even to email addresses other than support.

    So now you've got a plain text password and two plain text credit numbers along with their security codes stored in the inboxes of everyone who is on the support list, and possibly also somewhere in the database of your ticketing system if mail to support automatically creates a ticket.

    It gets worse. If you offer support by phone and that goes to voicemail after support hours you will find passwords and credit card numbers in there.

    [1] Note: technically, probably almost nothing is actually stored in plain text nowadays. It's almost always going to be stored on a filesystem that is using filesystem-level encryption, and that filesystem is likely on a block device that is doing block-level encryption. But I believe when people talk about "plain text" storage they mean it at a higher level. If I store the string "this is a secret" in foo.txt, that counts as plain text even though foo.txt is on an encrypted filesystem on an encrypted disk.

  • croes 10 hours ago

    That's why the fine is only $102M

  • delusional 10 hours ago

    > Meta does not intend

    is an odd concept. Is the argument that nobody noticed? If somebody noticed, but the cleanup was deferred, then they did "intend to".

    It's like defending a bank robber by saying that he didn't intend to rob the bank, he just had a gun in his hand, and then he figured the damage was already done, so he may as well get some money.

    • locallost 10 hours ago

      I think the comment is the context of being a software developer. "Everyone" knows you shouldn't do that, so it would be a bit odd if the company of Facebook's size would. But if it was accidental, then it makes it clearer how it happened. It's still a grave mistake, but not unthinkable. I personally write bugs all the time.

      • tengwar2 4 minutes ago

        "Everyone" does not know that. Avoiding plain-text passwords is the commonest method used on the Internet, but if you get into mobile telecoms, you find shared secrets stored in hardware-secure, write-only enclaves in clear text. This is actually done because in that specific environment it increases security. It's not a general solution of course, but "only store encrypted passwords" and "only store password hashes" don't always apply either.

      • cogman10 10 hours ago

        I gotta level with you, not everyone knows you shouldn't do that.

        There's a number of devs that don't think twice about storing sensitive keys in a git repo.

        I could 100% see how someone would do this, see log messages with passwords in plain text, and then faff off being the last person to actually look at those logs. "K, this case is done, what's next"

        • vidarh 10 hours ago

          I worked somewhere not long ago where the "solution" was to run code to scrub repos before building release packages, because those packages were in some cases installed in customer networks, which drastically increased the chance they might leak. At no point did the developers or the devops team seem to realise that the very need to do this meant they should apply the same scrubbing checks in a hook to reject secrets at commit time in the first place.

      • osullip 10 hours ago

        Logging passwords on the fly is probably common. Some debug or log action setup and forgotten.

        However, if you ever see a password in plain text you should raise alarms to the highest level.

        In this case, I don't think the alarm was raised.

        • cogman10 10 hours ago

          I agree, but also I know of devs that don't understand the basic security implications of passwords being in logs. I could easily see how someone, maybe even a couple of people, could see these logs and think nothing of them.

        • vidarh 10 hours ago

          Vast quantities of logs are never reviewed by anyone....

Culonavirus 11 hours ago

> a senior employee told Krebs on Security back then that the incident involved up to 600 million passwords. Some of the passwords had been stored in easily readable format in the company's servers since 2012. They were also reportedly searchable by over 20,000 Facebook employees

I thought this was gonna be some limited faux pas... but no. That's terrible.

nh2 10 hours ago

0.1 % of current revenue fine.

If your company made a billion $ revenue per year, it'd have to pay $1M.

Doesn't feel like a great incentive to do it right.

If logging all requests improved debuggability enough to make the company more than 0.1 % more efficient, it's a good deal for them.

  • incognition 43 minutes ago

    It’s more like there’s a director who gets paid $1 million to make sure the logging goes right, and he fucked up his job, so he should get fired, because they could have replaced him 100x over.

hannofcart 11 hours ago

Oh so the hashing and rainbow table attack questions they ask in Meta interviews is basically a cry for help?

  • Topfi 11 hours ago

    Salt and pepper? Sure you aren't applying to our canteen?

tveyben 10 hours ago

$102M might sound like a large sum, but the math shows that each leaked clear-text password here is fined at less than one dollar…

(Yes I have read the fine is triggered by not informing the authorities in due time)

Interesting how the affected user is actually valued…

can16358p 11 hours ago

I really don't get how companies so large do stupid things like this.

Hashing and salting passwords isn't some newly introduced advanced rocket science, it's literally a 101-level "obvious" thing. How a huge corporation like Meta/Facebook can do this is beyond my imagination.

  • Negitivefrags 11 hours ago

    The usual way this happens is accidentally logging passwords. Or even other cases where passwords happen to be included in something else. It can happen more easily than you think.

    Like for example, if you collect server side crash dumps, are you really taking care that there is no sensitive information sitting in the memory image stored in them?

    • grayhatter 5 hours ago

      the word you should use is carelessly, as in carelessly logging passwords.

      When working with data that you can reasonably expect to contain secrets, you should behave as if it does contain secrets. It worries me that you mention you're aware that server side crash dumps may contain sensitive data, but you also speak as if it's reasonable to not protect them knowing they do, or they might. I'd hope or expect anyone would mention or at least imply that it'd be negligent to behave so recklessly with someone else's secrets.

      • bongodongobob 3 hours ago

        What you're saying makes sense for a small startup with 20 people. Have you worked in an org of 10,000+ that's been around for 30 years? Where the people who built the systems no longer work there? Where the people who maintain them are 3+ career cycles removed? Things get so baked in that "one does not simply make changes", because no one person understands how everything works, so you do your best and keep closing tickets. "Hey, we should really investigate whether or not passwords are included in memory in our 2000 servers across the globe!" "Lol, ok, maybe when you close out the 30 tickets on your plate."

        • grayhatter 2 hours ago

          > Have you worked in an org of 10000+ that's been around for 30 years?

          yes.

          Would you still make this argument for something else known to be dangerous?

          Hey this gigantic lathe doesn't have a safety shut off switch.

          Ahh yeah the guy that did the wiring for that left years ago so we just don't touch it. It's fine as long as no one puts their hands near it when it's running.

          Hey, isn't this freshly dug 20 foot hole supposed to have an escape route, and side wall reinforcements so it doesn't collapse on someone? lol ok new guy, let me know when you've finished pouring that concrete and then we'll look at that idea.

          To go back to the example: taking steps to protect them could be as simple as restricting who can access core dumps, enforcing that they aren't stored unencrypted, and ensuring the only people who can copy and inspect them are already in a trusted role where they can get root on that production server. An even better option would be restricting the services that can read clear-text passwords: these machines don't crash, and when they do, we don't write a coredump (but this trusted team can start writing them if we see the system crashing suddenly).

          > Things get so baked in "one doesn't simply make changes" because no one person understands how everything works, so you do your best and keep closing tickets.

          This is a super disappointing attitude. It's hard, and it's the way we've always done it, so that means it's impossible, or not worth the effort? Yeah, I'm not likely to buy into that mentality. I believe I can fix things and have a positive impact. It's one thing to say it's impossible because you don't understand how to do it, that's wrong, but I guess if that's what you need to believe to sleep at night, I can pretend to understand.

          It's another thing entirely if you don't have the autonomy at work to try to improve something you know to be broken, and a risk to users and the company. If that's actually what you meant to describe, you might want to consider finding another job. You're clearly not an idiot, and there are plenty of companies that won't treat you like a code monkey.

  • progbits 11 hours ago

    It's unbelievable how little most developers care about security.

    At this point I've given up on educating them since it went nowhere, instead I'm locking down permissions to things like firewall and secret vault so random people don't fuck it up.

    • radicalbyte 10 hours ago

      I don't see that in capable developers (who usually end up pushed to "backend"), but it is absolutely endemic in frontend, and extremely problematic in organisations whose full stack was created frontend-first. All too often even the super seniors / leads have very limited knowledge of security (or performance, reliability, etc.).

    • wahnfrieden 11 hours ago

      Their managers don't incentivize them spending time on it, and their PM will fight security tickets they don't understand the need for. Most devs have little autonomy at orgs today and operate under a strict hierarchy of command at the ticket level.

      • pbhjpbhj 10 hours ago

        Presumably, "we've been storing 600 million passwords in plaintext" is understandable to their PMs, given it's understandable to complete laymen. Aren't FAANG companies supposed to employ the very brightest minds?

        Hard to imagine this wasn't done on purpose.

    • throwawaysleep 10 hours ago

      Using terms like wallet, threat, or prestige makes me, as a non-altruistic person, care about the security of the systems my employer owns.

  • herval 5 hours ago

    An accidental print statement on the login page. That’s all it takes.

  • fulafel 10 hours ago

    Big organizations just have a lot of bureaucracy attempting to codify common sense. I bet they have paperwork saying you shouldn't do this.

  • vanjajaja1 10 hours ago

    It happens because you have one component logging everything for traceability, and it sits at the interconnect between two other components which need to communicate passwords. Thus the password accidentally slips into a plain-text log file.

    Generally there are tools that search for and flag PII logging; if it slips through the tools, it's because there are layers of indirection involved.

  • rjzzleep 11 hours ago

    Most of the time it's when people reinvent stuff from scratch. That's why I'm always careful when people implement their own framework in the next shiny thing. Lack of hashing, CSRF protection, and basic XSS issues happen all the time in the Rust, Go, and especially Node.js web crowds.

  • tjpnz 11 hours ago

    When you select candidates based on whether they know how to invert a BST and other trivia, it's not terribly surprising.

    • FartyMcFarter 10 hours ago

      How would you select candidates to make sure they avoid this kind of security bug?

      • pbhjpbhj 10 hours ago

        Avoiding the bug in the first place is not the big issue, surely. An employee can leave a door open to a secure area, if you employ security people they should have worked to mitigate that, and to catch it if it did happen.

        Well "we want to run an audit to find if any passwords are stored in plaintext in our file systems". Sounds like a problem any highschooler could answer?

        The idea of an audit might need a person who has done a remedial level of security work.

      • Arbortheus 10 hours ago

        “You are building an account authentication system and need to store user passwords, what design measures would you take to ensure this would be done securely?”

        Follow up question. “What measures would you take to ensure this complies with relevant data protection regulations?”

        If they give a somewhat competent answer, you at least know they’re the sort of developer that can critically analyse this sort of thing.
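
For what a "somewhat competent answer" might contain, here is a minimal salted-hash sketch using Python's standard-library scrypt (a memory-hard KDF; the cost parameters are illustrative, and a production system would typically reach for bcrypt/argon2 via a vetted library):

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Hash a password with a per-user random salt using scrypt."""
    salt = os.urandom(16)  # unique per user: defeats rainbow tables
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive and compare in constant time; the password itself is never stored."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("S3cr!t")
print(verify_password("S3cr!t", salt, digest))  # True
print(verify_password("wrong", salt, digest))   # False
```

And, tying back to the thread: none of this helps if the plaintext credential is logged on the way in, so "never log the password" belongs in the answer too.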

        • tjpnz 10 hours ago

          >What measures would you take to ensure this complies with relevant data protection regulations?

          We hire devs to work on safety critical systems and do gauge for familiarity with standards applicable to our industry. This happens regardless of whether you're mid career or a new grad. I don't think it unreasonable to expect similar of devs working on systems where there are security or privacy considerations (which might as well be everything).

  • bongodongobob 11 hours ago

    I'm honestly never surprised by any of this stuff. I've done some contracting and file access is always a shitshow.

    Picture this:

    > Intern or contractor gets hired.

    > Someone runs a script to create the user, because permissions have turned into a rat's nest that no human can understand

    > No one knows how the script works anymore, it's probably outdated and only does 60% of the job

    > User is added to a quagmire of groups

    > I cant access the Citrix apps

    > "Oh you don't have access and I'm waiting on that app owner to give you access, give me a few days"

    > Ok it works but I don't have a login for "the program I need"

    > "Looks like there's a licensing issue, give me a few days"

    > Ok it works but I don't have credentials for the database connection

    > "It's legitimately complicated, give me a few days"

    > IT gets tired of fucking with the back and forth on this 13 day old ticket "Just give him everything, so he can get X done"

    > Get access

    > Finally. Let's just dump this out into another file/folder/location so I don't have to go through hoops again and I can actually do my job.

    Now that dump is in a remote profile folder or /temp folder somewhere in a 100TB blob that will be backed up for 7 years or more. I've been on both sides.

    • bongodongobob 2 hours ago

      To the comment below about measuring security... Usually this comes from putting security theater over pragmatism. Everything is so granular it's impossible to figure out what people need so you fuck around with it for a day and then give up and just give em local admin because actual work needs to be done. You can't get away from the fact that work requires write access. All those groups and policies are meaningless when the rubber hits the road. In the end you just have to hire people who can be trusted to do the right thing and not burn the house down.

DexesTTP 11 hours ago

Context: This is for a 2019 data breach in a system that was created in 2012. The GDPR took effect in 2018 (has it really been that long? Wow, feels like yesterday), and Meta failed to disclose the 2019 data breach properly under GDPR, hence the fine.

  • sakisv 11 hours ago

    Honest question: How was it discovered?

    Was it reported by a pentester? (ex-)employee? Facebook itself? How do we know that it goes back to 2012?

    I know in the public sector you have to disclose such things to the ICO, but does that also apply to private companies? Who is going to hold them accountable?

  • chrismorgan 11 hours ago

    I was concerned, reading your comment first, that the title ("Meta fined $102M for storing passwords in plain text") was going to be false, i.e. that they were actually only fined for not disclosing the breach. But the article says the decision also found a GDPR violation for storing the passwords in plaintext, so that's good:

    > The DPC found that Meta violated several GDPR rules related to the breach. It determined that the company failed to "notify the DPC of a personal data breach concerning storage of user passwords in plaintext" without undue delay and failed to "document personal data breaches concerning the storage of user passwords in plaintext." It also said that Meta violated the GDPR by not using appropriate technical measures to ensure the security of users' passwords against unauthorized processing.
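    The "appropriate technical measures" the DPC refers to generally mean salted, slow password hashing rather than plaintext storage. A minimal sketch using only Python's standard library (the scrypt parameters here are illustrative, not a tuned production recommendation):

    ```python
    import hashlib
    import hmac
    import os

    def hash_password(password: str) -> bytes:
        # A random per-user salt defeats precomputed rainbow-table attacks.
        salt = os.urandom(16)
        digest = hashlib.scrypt(password.encode(), salt=salt,
                                n=2**14, r=8, p=1, maxmem=2**26)
        return salt + digest  # store the salt alongside the hash

    def verify_password(password: str, stored: bytes) -> bool:
        salt, digest = stored[:16], stored[16:]
        candidate = hashlib.scrypt(password.encode(), salt=salt,
                                   n=2**14, r=8, p=1, maxmem=2**26)
        # Constant-time comparison avoids timing side channels.
        return hmac.compare_digest(candidate, digest)
    ```

    Note that hashing at the database layer doesn't help if plaintext leaks upstream of it, e.g. via request logging, which is reportedly what happened here.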

  • pibefision 11 hours ago

    GDPR fine is 4% of global turnover from previous fiscal year. 102m seems low to me.
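    As a rough illustration (assuming Meta's 2018 global revenue of about $55.8B, the fiscal year preceding the 2019 breach), the 4% ceiling would indeed sit far above the actual fine:

    ```python
    # Back-of-the-envelope only; the revenue figure is an assumption.
    revenue_busd = 55.8                        # Meta's 2018 revenue, ~$55.8B
    gdpr_ceiling_busd = 0.04 * revenue_busd    # 4% of prior-year turnover
    actual_fine_busd = 0.102
    print(f"ceiling ~${gdpr_ceiling_busd:.2f}B vs fine ${actual_fine_busd}B")
    # → ceiling ~$2.23B vs fine $0.102B
    ```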

    • poincaredisk 11 hours ago

      That's the maximal fine (that was never used as far as I know, at least on a large company). In this case the fine is understandably much smaller, since the privacy incident is not critical, and Facebook reported the problem to the authorities on its own.

    • bootsmann 11 hours ago

      That's the maximal fine I think, the judges can set the amount depending on the severity of the violation.

bberrry 10 hours ago

I would hope it's not the authentication team's systems that are logging payloads with passwords.. they should definitely know better. Presumably it happened in some infrastructure component owned by another team.
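One common guard against exactly this failure mode, regardless of which team owns the logging layer, is scrubbing known-sensitive fields before a payload ever reaches a log sink. A hypothetical sketch (the field names are assumptions):

```python
SENSITIVE_KEYS = {"password", "passwd", "secret", "token"}

def scrub(payload: dict) -> dict:
    """Return a copy with sensitive values masked, recursing into nested dicts."""
    clean = {}
    for key, value in payload.items():
        if key.lower() in SENSITIVE_KEYS:
            clean[key] = "[REDACTED]"
        elif isinstance(value, dict):
            clean[key] = scrub(value)
        else:
            clean[key] = value
    return clean

# Example: log the scrubbed payload rather than the raw one.
print(scrub({"user": "bob", "password": "hunter2"}))
# → {'user': 'bob', 'password': '[REDACTED]'}
```

A denylist like this is brittle (it misses renamed or free-text fields), which is part of why these leaks keep happening even at companies that know better.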

  • junon 6 hours ago

    Why are passwords leaving the auth services though?

AmericanChopper 11 hours ago

This is a very imaginative use of the word “breach”, according to the details reported in the article at least. Internal staff (inadvertently) had access to users’ plaintext passwords. The article doesn’t mention any use of these credentials in a breach though, and doesn’t make any refutation of Meta’s claim that this never occurred. Internal staff having access to my data is what I would normally expect from a service like the ones Meta operates. It’s a bad mistake to make, but contriving these circumstances into being a “breach” is a bit more mask-off than I’m used to the Data Protection agencies being. Hope Ireland makes good use of its $102M.

  • nomilk 11 hours ago

    In many senses, internal staff having access to plaintext passwords is a breach.

    • AmericanChopper 11 hours ago

      It’s a control failure, not a breach. It would also be an incident, one that could result in a subsequent breach, or one that warrants some work to be done to ensure it does not turn into a breach. But it has not resulted in an unauthorised party gaining access to the data, and is therefore not a breach.

      • ethbr1 11 hours ago

        I think it's impossible to say there was no breach, given they were exposed for 7 years.

        • AmericanChopper 11 hours ago

          I would agree. Which is why I’m suggesting that the Irish Data Protection Commission and Engadget should refrain from saying that.

          It’s also impossible to say that I am not responsible for a breach of your private data either. How much should the Irish Data Protection Commission fine me?

          • ethbr1 6 hours ago

            "Potentially breached" would probably be an accurate phrasing.

            If you had my private data written in the back of a notebook, that you carried around with you to coffee shops, for a few days, I'd feel substantially better than if you did it for a few years.

            Likelihood that someone peeked scales with time.

            • AmericanChopper 6 hours ago

              Sure, and I don’t disagree that it’s a bad situation for Meta to have created. It’s being fined for “potentially violating” a statute that I find objectionable. Being breached implies that some harm befell consumers, this article (and the others I’ve read about this incident) don’t make any reference to an actual harm being uncovered.

              • ethbr1 5 hours ago

                I think there should be a "reasonable expectation" of a breach having happened.

                Secrets lying in an accessible place for 7 years... the reasonable expectation is that someone looked at them.

                • AmericanChopper 5 hours ago

                  It’s a reasonable possibility, it’s also a reasonable possibility that nobody did.

                  But somebody incidentally seeing them, and maybe, maybe not recognising they were passwords is not a breach. Somebody intentionally misusing them would be, and I haven’t seen anything to suggest there’s a reasonable expectation that that occurred.

                  A reasonable expectation is also not the standard of proof I’d generally like to see from a government attempting to enforce a penalty.

                  • ethbr1 4 hours ago

                    I think knowledge of them would absolutely be a breach, because you wouldn't be able to guarantee that person didn't remember and subsequently misuse them.

                    • AmericanChopper 4 hours ago

                      If the data was publicly leaked then this argument would have some merit. But it wasn’t, the only people who could have accessed these passwords were insiders, and those passwords were used to protect data that many of them would have access to anyway.

                      There is no evidence any unauthorised party gained access to any private data as a result of this incident. There is no evidence that any authorised party misused data as a result of this incident. A “breach” that involves no unauthorised access, and no misuse is not a breach.

                      A control failure occurred, and it was remedied in the most appropriate way possible.

      • markarichards 9 hours ago

        It's relatively common for publications to lazily only reference an action that resulted in a legal outcome, rather than the justification provided for the outcome.

        For instance, “Bob imprisoned for car bomb” rather than “Bob imprisoned after a judgement ruled that deaths unlawfully resulted from Bob's malicious car bombing”. Had Bob's car bomb been on a film set and no one hurt, Bob would hopefully be fine.

        If you read coverage with this in mind, then what matters is more a case of how likely an action is to be unlawful and thus how lazy the publication is being.

        If someone blows up a car, we'd assume it was unlawful. If a company is reported as storing passwords in plaintext, we'd likewise assume it was unlawful, and hopefully for good reason...

        From GDPR: "personal data breach’ means a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed"

        A typical security policy for securing passwords is to never store them in plaintext.

        It would be a rare situation for the storage to not be accessible (what would be the point of storing it).

        Thus it would seem fair to assume that in most cases plain text storage of passwords would be a breach of security (internal controls breach) and would implicitly also be a breach of personal data (legal definition), as it would at the very least be accidentally accessible to staff, contractors or third parties (whoever hosts the storage).

        So, it will likely fit the definition of a breach.

        But, it still needs to escalate to a point where it would be recognised as serious enough to warrant action (like reporting to data subjects or regulators).

        There are situations where storing passwords in plaintext may not warrant reporting or fines, such as if upon realising the breach it was evident that nobody had accessed the data and it was destroyed before harm could be realised; but I doubt anyone would ever know about these situations happening in companies so it's fair to assume they wouldn't reach major news sites.

        • AmericanChopper 8 hours ago

          Even by the broadest possible definition of a breach, this is still just a control failure rather than a breach. The control that failed might have made it possible for Meta employees to perpetrate a breach, but the article makes no mention of that happening, or provides any suggestion that there is evidence that it might have happened.

          At least one point in my career, I have also accidentally mishandled password data (I accidentally leaked them into a log one time - well one time that I know of at least). When I did that I caused a control to fail, and I caused a security incident that required follow up remediation work (including password resets and disclosure), which is exactly what happened here. But I did not cause a data breach to occur. I struggle to imagine a world where I could have caused my employer to be fined $102M for that incident, and for that to be deemed a data breach, when there is no evidence (presented or referenced in this article at least) that a breach ever occurred. If I leave the office and forget to lock the door, I've caused a control failure. But if nobody comes in to rob us, then I haven't caused a robbery or a breach or anything else like that to occur, even if a typical security policy might require me to lock the door before leaving.

          The creativity required to come to this conclusion doesn't do anything to improve the credibility of the GDPR, which from an outside perspective really doesn't look like anything other than an import tariff on foreign tech in disguise.

          • markarichards an hour ago

            I like to think of a breach as a hole through the hull... it doesn't mean the boat is sinking or even ever will sink; just that the layers of security protection have been compromised.

            In the case you mention it seems that happened too: internal actors could reach plaintext passwords and thus for safety the company responded by forcing password reset and disclosure (commendable as I know of companies that would not).

            The term "personal data breach" is useful because it defines the range of breaches that the law focuses on (it's not interested in business data or incidents where the first layer of defence fell but the second kept it secure).

            I feel it's a bit like having a determination for "road traffic incident". It helps the public, police, etc identify what is in scope... just because you have one doesn't mean you'll lose your licence or be fined - that depends on a range of factors regarding the lead up to the incident: what happened before, during and after. Similar with data breaches.

            If a company has a breach it does not mean much in GDPR unless other factors are considered, so I wouldn't worry about being too focused on the term breach.

  • shprd 11 hours ago

    > This is a very imaginative use of the word “breach”

    You're mistaken. You might be thinking of breach in terms of "hacking into", but they used it as:

      personal data breach 
    
    Which accurately means "unauthorised access to personal data"[0] and seems to be the language used by the DPC.

    [0] - https://ico.org.uk/for-organisations/law-enforcement/guide-t...

    • AmericanChopper 11 hours ago

      That describes an entirely different incident to the one referenced in this article.

      • shprd 11 hours ago

        Edited the link out. It doesn't make a difference anyway for the purpose of this discussion.

        • AmericanChopper 10 hours ago

          You didn’t edit the link out, you replaced your comment with a completely different one. I always thought HN was pretty good at preventing ninja edits like that…

  • smittywerben 8 hours ago

    I don't know about you but if tens of millions of passwords stored in plaintext are accessible to 80k people they're as good as useless now. You're thinking too much like "hacker selling data security" and not enough like "stalker who works at facebook logged into my gmail because I use the same password as my facebook" regular bob security. Just because you didn't end up in a dataset on some forum doesn't mean that someone's ex has a boyfriend who started his first day at facebook and left his laptop unattended, and the ex saw your facebook password is your "cat's name + 123" in the debug log, and nobody at Facebook says anything for years or something.

    Anyways I think it's fine for them to define breach as the loss/destruction of data i.e. making a password known, which destroys its value.

    • AmericanChopper 5 hours ago

      If this control failure allowed malicious insiders to access private data, and misuse people’s personal accounts, then a data breach would have actually occurred. But I haven’t seen any suggestion that this happened, only references to the possibility that it might have happened. I’m really just thinking like somebody who believes that if the government is going to punish you for something, then I believe the event you’re being punished for should have actually occurred, and also that they should be able to prove it occurred.

      If reference to standard security policies formed part of the basis of this decision (as the article states), then the harm that you’re trying to contrive into existence here also has no merit. There is no framework of information security that allows for a password to permanently retain its value as a secret keeping tool. Conventionally passwords have only retained their value for a set period of time, and even the most modern security standards for managing secrets requires you to rotate them at even the most remote possibility that they were exposed. The idea that a password rotation has harmed Facebook users, and the implication that their password was a valuable asset that they could reasonably expect to retain its value forever is quite ridiculous.

      • smittywerben 2 hours ago

        I could agree that this fine is bureaucratic Big Compliance enforcing its made-up standards. At the same time, it's hard for me to feel bad for Facebook.

        If someone's violating internal auditing procedures, those same procedures won't catch them. It's dangerous because it's a violation of the procedure itself. Proving such violations without tools like no-knock warrants or the NSA moving in is nearly impossible.

        So you end up with a misappropriated circus of Big Compliance issuing fines over no wrongdoing and internal audits finding no wrongdoing when you rarely hear about this type of internal abuse unless someone is careless enough to brag about it to their Tinder date.

Myrmornis 9 hours ago

Who gets the money and what will it be spent on?

qwerty456127 10 hours ago

Wow I didn't know this is illegal.

  • grayhatter 6 hours ago

    you didn't know that it was illegal to be careless in a way that's well known to cause harm to other people?

  • cubefox 9 hours ago

    Certainly depends on the country.

schleck8 11 hours ago

Can't wait to give them access to everything I do on the daily by wearing their AR glasses.

nomilk 11 hours ago

[flagged]

  • Mountain_Skies 10 hours ago

    Most of them have signed an NDA promising not to do it. Is that not enough? /s