A data center is like the brain of the internet. The engine of the internet. It is a huge building with a great deal of power, a lot of cooling and a lot of computers. Row after row of machines, all working together to provide the services that make Google work.

I love building and operating data centers. I'm Joe Kava, Vice President of Data Centers at Google. I am responsible for managing the teams worldwide that design, build and operate Google's data centers. We are also accountable for our impact on the environment, for sustainability and for the CO2 emissions of our data centers. This South Carolina data center is one node in a larger network of data centers in regions all over the world. Only a very small percentage of all Google employees are authorized to access a data center campus. The men and women who run these data centers and keep them operational 24 hours a day, seven days a week, are extremely passionate about what they do.

How do you explain to someone in a simple way what I do? I generally say that I herd cats. I am a technician. I am the Hardware Site Operations Manager. We make sure the lights stay on. And we enjoy doing that. And they work very hard, so we want to provide them with an enjoyable environment where they can also unwind. We just hit the three-million man-hour mark with zero lost-time incidents. Three million man-hours is a really, really long time, and with the number of people working on site, that is a major achievement.

I believe Google's data centers can truly provide a level of security that virtually no other company can match. We have an information security team that is truly unparalleled. Do you know the expression, "They wrote the book on it"? Well, many members of our information security team have actually written books with practical guidance on protecting information. Protecting the security and privacy of our users' information is our primary design criterion.

We use several layers of progressively higher security as you get closer to the center of the campus. Just to enter this campus, my badge has to be on a pre-approved access list. Then, to get into the building, you encounter another level of security. If you want to enter the secure corridor towards the data center, there is an even higher level of security there. The data center and the network rooms have the highest security level. The technologies we use are different as well: in the areas with the highest level we even use underfloor intrusion detection via laser beams. I am now going to show you how to access the secure corridor. First, I have to be on the access list with my badge. Second, I have to use a biometric iris scanner to confirm that it is really me.

Okay. Here we are, in the data center room. The first thing you notice is that it is a little warm here. It is about 27 degrees. Google runs its data centers warmer than most because it improves efficiency. You will see that we have the power supply along the top. That high-voltage feed comes in from outside and is distributed along the overhead conductors to all the taps, which are basically plugs into which we plug all the extension cords. Our racks don't really look like a traditional server rack.
They are custom designed and built for Google, so that we can optimize the servers for hyper-efficiency and high-performance computing. It does happen that drives sometimes fail and we need to replace or upgrade them, because they may no longer be efficient. We have a very thorough chain-of-custody process for managing those drives, from the time they are checked out of the server until they are taken to an ultra-secure cage, where they are erased and crushed if necessary. So if a disk cannot be verified 100% clean, it is first crushed and then taken to an industrial wood chipper to be shredded into little pieces like this.

In the time I've been working at Google, almost six and a half years, we have changed our cooling technologies at least five times. Most data centers have air conditioning units along the outer walls blowing cold air under the floor. This air then rises in front of the servers and cools the servers. In our solution we take the server racks and butt them right up against our air conditioning unit. We use cold water that flows through the copper coils that you see there. The hot air from the servers is contained in that warm aisle. It rises and passes across those coils, where the heat from the air is transferred to the water in those pipes. That warm water is then taken outside the data center, where it is cooled in our cooling plant through our cooling towers and returned to the data center. And that process is repeated all the time.

What amazes me about Google and the data centers is the rate of innovation and the way we are always challenged in how we work. So when people say innovation in some area is finished and we have peaked in terms of what can be achieved, I just laugh. [MUSIC]