Makes you wonder though, in a time when resources are getting scarcer and scarcer, about the value of building such a huge energy hog.
(\___/) (\___/) (\___/) (\___/) (\___/) (\___/) (\___/)
(='.'=) (='.'=) (='.'=) (='.'=) (='.'=) (='.'=) (='.'=)
(")_(") (")_(") (")_(") (")_(") (")_(") (")_(") (")_(")
This is Bunny and friends. He is fed up with waiting for everyone to help him out, and has decided to help himself instead!
It's particle physics which is (probably) going to give us the best solutions for replacing our crude, fossil-fuel-burning technologies.
Either that or wipe out the world and possibly the universe in an instant.
Either way, let them have their fun with their giant frikkin laser.
Yeah, that system is called CASTOR. It's good and bad. In certain circumstances, you can certainly be waiting for quite a while for your data to be staged on the hard-disk front-end. I guess that's pretty unavoidable though.
In raw data alone - i.e. before it gets turned into more useful physics analysis objects - the CMS experiment that I'm working on will be producing about 10 petabytes per year. Once you factor in converting the raw data into more useful formats, add on some Monte Carlo simulation data, and then some data duplication to 2 or 3 other large centres (such as Fermilab), it'll be pushing 100 petabytes per year.
The ATLAS experiment will be producing about the same amount. I'm not sure about the two other smaller experiments, LHCb and ALICE, but let's call them about 100 petabytes together.
So... ball-park, the LHC experiments altogether will be producing about 300 petabytes of data per year, and we'll be running for roughly 10 years before we upgrade things. So, for the lifetime of all current LHC-related CERN experiments, I guess the total dataset will be in the region of 3 exabytes.
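For anyone who wants to sanity-check the arithmetic, here's the back-of-the-envelope sum as a quick sketch. The per-experiment figures are just the rough numbers quoted above, not official rates:

```python
# Back-of-the-envelope LHC data-volume estimate, using the rough
# per-experiment figures quoted in the post (petabytes per year).
yearly_rates_pb = {
    "CMS": 100,          # raw + derived formats + Monte Carlo + duplication
    "ATLAS": 100,        # roughly the same as CMS
    "LHCb+ALICE": 100,   # guessed combined figure for the two smaller experiments
}

years_of_running = 10  # rough lifetime before major upgrades

total_per_year_pb = sum(yearly_rates_pb.values())  # 300 PB/year
total_pb = total_per_year_pb * years_of_running    # 3000 PB
total_eb = total_pb / 1000                         # ~3 exabytes

print(f"{total_per_year_pb} PB/year -> {total_eb:.0f} EB over {years_of_running} years")
```

Which lands on the same ball-park figure: about 3 exabytes over the experiments' lifetime.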
That's a lot of data to be in one place... any backup? Or is that the data duplication with Fermilab?
There isn't enough porn in the world!
Yes, data duplication with other major data centres, such as the one at Fermilab. Basically CERN forms the "Tier 0" data centre, where one copy of the entire dataset will be stored. There are then a number of "Tier 1" data centres, such as at Fermilab in the US and the Rutherford Appleton Laboratory here in the UK. Between all these Tier 1 centres, the data will be duplicated, but not all of it at any single Tier 1 centre. Then there are Tier 2 centres at various universities, which again between them duplicate all the data at each Tier 1. So, it should be fairly safe.
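The tier scheme described above can be sketched as a toy model: Tier 0 (CERN) holds the full dataset, each Tier 1 holds only a subset, but the Tier 1 centres together cover everything. The centre names and data shares here are purely illustrative, not the real distribution:

```python
# Toy model of the WLCG-style tiered replication described in the post.
# Pretend the dataset is 12 equal chunks.
dataset = set(range(12))

# Tier 0: CERN stores one copy of the entire dataset.
tier0 = {"CERN": set(dataset)}

# Tier 1: each centre holds an (overlapping) subset -- shares are made up.
tier1 = {
    "Fermilab": set(range(0, 6)),
    "RAL": set(range(4, 10)),
    "Other-T1": set(range(8, 12)),
}

# No single Tier 1 centre holds the whole dataset...
assert all(copy != dataset for copy in tier1.values())

# ...but between them, the Tier 1 centres duplicate all of it.
assert set().union(*tier1.values()) == dataset

print("Tier 1 centres collectively cover the full dataset")
```

The same pattern repeats one level down: the Tier 2 centres between them duplicate the data held at each Tier 1, so losing any one site shouldn't lose any data.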
Better get humping (and videoing).
The LHC is now officially the highest energy accelerator in the world, after breaking the previous record in the early hours of this morning:
http://press.web.cern.ch/press/Press.../PR18.09E.html
Congratulations Gordon.... er...Fraz
Oi Fraz!!!
That is all.
Congrats Fraz et al
Now, *try* not to vaporise most of the Franco-Swiss border, at least not this side of Christmas; it does vex people when that happens.
(\__/)
(='.'=)
(")_(")