mozjpeg 2.0 is being tested by Facebook to reduce server load and help user pages load faster.
A whole 5% off of a small lossy file? And they aren't compatible with some decoders?
Hmmm, no wonder Firefox has hit rock bottom, they obviously don't know where to allot dev time!
Facebook transfers hundreds of terabytes of data per day, and most of that is images! 5% of just 100TB is 5TB, more than you probably have in your whole system. That's a lot of savings for big companies; imagine how much money Google could save by applying this to Google Image Search! Savings like that will indirectly benefit users in the long run.
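A quick back-of-the-envelope sketch of that arithmetic. The 100TB/day figure and the 5% reduction are the thread's assumptions, not published Facebook numbers:

```python
# Back-of-the-envelope: daily/yearly savings from a ~5% image size
# reduction, assuming 100 TB/day of image traffic (thread's figure,
# not an official one).
daily_image_traffic_tb = 100
savings_rate = 0.05  # mozjpeg 2.0's claimed ~5% size reduction

daily_savings_tb = daily_image_traffic_tb * savings_rate
yearly_savings_tb = daily_savings_tb * 365
print(f"~{daily_savings_tb:.0f} TB/day, ~{yearly_savings_tb:.0f} TB/year saved")
```

At that assumed scale, the saving compounds to roughly 1.8 petabytes a year of bandwidth and storage.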
To be fair, if Facebook wanted to reduce their image overhead by 5%, they'd probably just dial down the JPEG quality slightly. I doubt most users would even notice.
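You can see the quality-dial effect with a minimal sketch using Pillow (assumed installed); the image here is synthetic noise standing in for a photo, so the exact byte counts are illustrative only:

```python
from io import BytesIO
import random

from PIL import Image  # assumes Pillow is installed

# Synthetic stand-in for a photo (random grayscale noise), since no
# real image is available here.
random.seed(0)
img = Image.new("RGB", (256, 256))
img.putdata([(random.randrange(256),) * 3 for _ in range(256 * 256)])

def jpeg_size(image, quality):
    """Encode `image` as JPEG at the given quality; return the byte count."""
    buf = BytesIO()
    image.save(buf, "JPEG", quality=quality)
    return buf.tell()

size_q90 = jpeg_size(img, 90)
size_q70 = jpeg_size(img, 70)
print(f"quality 90: {size_q90} bytes, quality 70: {size_q70} bytes")
```

Dropping the quality setting shrinks the file without any decoder-side changes, which is exactly why it's the easier lever for a site to pull.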
Don't get me wrong, it's cool and all, but I'm not sure it's a problem that particularly needed to be solved - not least by an organisation that's struggling to stay relevant in the browser arena at the moment.
A new codec/algorithm means updating all the software that consumes the content to support it.
We have already been down this path: JPEG 2000, JPEG XR, WebP. All of them offer real advantages over plain JPEG, but none of them has reached widespread support yet.
Take WebP, the most modern of those formats: only Chrome and its derivatives support it natively, and only some software/websites let you upload WebP files, which are ultimately converted to JPEG for public consumption (Facebook, Gmail, Drupal/Magento CMS).
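That server-side conversion step can be sketched with Pillow (assumed installed, and built with WebP support); the tiny in-memory image is a hypothetical stand-in for an uploaded file:

```python
from io import BytesIO

from PIL import Image  # assumes Pillow with WebP support

# Hypothetical sketch of the server-side step described above:
# accept a WebP upload, re-encode it as JPEG for broad compatibility.
webp_buf = BytesIO()
Image.new("RGB", (64, 64), (200, 120, 40)).save(webp_buf, "WEBP")
webp_buf.seek(0)

jpeg_buf = BytesIO()
Image.open(webp_buf).convert("RGB").save(jpeg_buf, "JPEG", quality=85)
print(Image.open(BytesIO(jpeg_buf.getvalue())).format)  # JPEG
```

The site keeps WebP's upload-side convenience while serving the one format every browser and device can already decode.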
For a new image format to gain traction it needs to be supported by the 4 big browsers out there: IE, Firefox, Safari and Chrome.
The thing with JPEG is that software to consume it is already out there, and with the advancements in mozjpeg 2.0 its output is probably compatible with 95% of the software developed in the last 5 years, if not more. Try switching to WebP and even Android phones from the last 2 years will have trouble if they are stuck on Android 2.3 (Gingerbread).
Not to be pedantic, but both Google and Facebook are, for the most part, free; the only cost involved is allowing ourselves to be data-mined and ad-attacked. What possible savings could there be for the average consumer? We aren't talking BMPs here, or RAW, or anything remotely close. As has been stated already, Facebook (and Google) are both well capable of reducing quality by 5%, or 25%, or pick an arbitrary number. Most of us (as in, almost all of us) won't notice, for the most part.
What Mozilla needs to do is get back to concentrating on its flagship product, removing the recently introduced bloat, actually following industry standards (HTML5 plz - Flash needs to go), and making a product that, at one point, actually was a fair replacement for IE.
I think Wozza meant the saving to them, not to us. Someone has to pay for all the storage, server processing, internet traffic from their data centres, etc. Reducing that by 5% while still maintaining quality could be a massive boon for them. Granted, reducing the quality by a few percent could have the same outcome without anyone noticing; however, photo sharing is a key part of their business models now, so compromising quality could result in negative publicity. Facebook or Google announcing a tweak to their software that improves load times and reduces bandwidth while maintaining quality is a win-win for all.
Also remember that an ever-increasing proportion of traffic on social networks comes from mobile devices that are often memory- and processor-constrained. Users on low data caps on mobile connections would also benefit from any reduction in image file sizes.