Help us fill in the gaps!



We've been targeting malware for over a year and a half, and these efforts are paying off. We are now able to display warnings in search results when a site is known to be malicious, which can help you avoid drive-by downloads and other computer compromises. We are already distributing this data through the Safe Browsing API, and we are working on bringing this protection to more users by integrating with more Google products. While these are great steps, we need your help going forward!

Currently, we know of hundreds of thousands of websites that attempt to infect people's computers with malware. Unfortunately, we also know that there are more malware sites out there. This is where we need your help in filling in the gaps. If you come across a site that is hosting malware, we now have an easy way for you to let us know about it: just fill out this short form. Help us keep the internet safe, and report sites that distribute malware.

Auditing open source software



Google encourages its employees to contribute back to the open source community, and Google's Security Team is no exception. Let's look at some interesting open source vulnerabilities that were located and fixed by members of the team. It is also interesting to classify and aggregate the code flaws leading to the vulnerabilities, to see whether any particular type of flaw is more prevalent.
  1. JDK. In May 2007, I released details of an interesting bug in the ICC profile parser in Sun's JDK. The bug is particularly noteworthy because it could be exploited by an evil image; most previous JDK bugs required the user to run a whole evil applet. The key parts of the code that demonstrate the bug are as follows:

    TagOffset = SpGetUInt32 (&Ptr);
    if (ProfileSize < TagOffset)
      return SpStatBadProfileDir;
    ...
    TagSize = SpGetUInt32 (&Ptr);
    if (ProfileSize < TagOffset + TagSize)
      return SpStatBadProfileDir;
    ...
    Ptr = (KpInt32_t *) malloc ((unsigned int)numBytes+HEADER);

    Both TagSize and TagOffset are untrusted unsigned 32-bit values pulled out of images being parsed. They are added together, causing a classic integer overflow condition and the bypass of the size check. A subsequent additional integer overflow in the allocation of a buffer leads to a heap-based buffer overflow.
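    A fix needs to make sure the comparison cannot be bypassed by wraparound. One way to do that, sketched below with similar names purely for illustration (this is not the actual JDK patch), is to bound each untrusted value separately and then compare against the remaining space using a subtraction that cannot underflow:

    #include <stdint.h>

    /* Hypothetical helper, not the actual JDK fix: returns 1 only if a tag
       starting at 'offset' and spanning 'size' bytes fits inside
       'profile_size'. Subtracting instead of adding means the untrusted
       32-bit values can never wrap around. */
    static int tag_fits(uint32_t profile_size, uint32_t offset, uint32_t size)
    {
        if (offset > profile_size)
            return 0;
        return size <= profile_size - offset;
    }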

  2. gunzip. In September 2006, my colleague Tavis Ormandy reported some interesting vulnerabilities in the gunzip decompressor. They are triggered when an evil compressed archive is decompressed. Many programs automatically pass compressed data through gunzip, which makes this an interesting attack vector. The key parts of the code that demonstrate one of the bugs are as follows:

    ush count[17], weight[17], start[18], *p;
    ...
    for (i = 0; i < (unsigned)nchar; i++) count[bitlen[i]]++;

    Here, the stack-based array "count" is indexed by values in the "bitlen" array. These values are under the control of data in the incoming untrusted compressed data, and were not checked for being within the bounds of the "count" array. This led to corruption of data on the stack.
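    A hardened version of that loop would reject any code length that cannot safely index "count". The sketch below is illustrative only; the names are borrowed from the excerpt and it is not the actual gzip patch:

    #include <stddef.h>

    /* Hypothetical bounds-checked version: "count" has 17 entries, so any
       bit length greater than 16 indicates a corrupt archive and is
       rejected instead of being used to corrupt the stack. */
    static int fill_counts(unsigned count[17], const unsigned char *bitlen,
                           size_t nchar)
    {
        for (size_t i = 0; i < nchar; i++) {
            if (bitlen[i] > 16)
                return -1;          /* malformed input: bail out */
            count[bitlen[i]]++;
        }
        return 0;
    }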


  3. libtiff. In August 2006, Tavis reported a range of security vulnerabilities in the libtiff image parsing library. Many image manipulation programs and services use libtiff to handle TIFF format files, so an evil TIFF file could compromise a lot of desktops or even servers. The key parts of the code that demonstrate one of the bugs are as follows:

    if (sp->cinfo.d.image_width != segment_width ||
        sp->cinfo.d.image_height != segment_height) {
      TIFFWarningExt(tif->tif_clientdata, module,
        "Improper JPEG strip/tile size, expected %dx%d, got %dx%d",

    Here, a TIFF file containing a JPEG image is being processed. In this case, both the TIFF header and the embedded JPEG image contain their own copies of the width and height of the image in pixels. The check above notices when these values differ, issues a warning, and continues. The destination buffer for the pixels is allocated based on the TIFF header values, and it is filled based on the JPEG values. This leads to a buffer overflow if a malicious image file contains a JPEG with larger dimensions than those in the TIFF header. Presumably the intent here was to support broken files where the embedded JPEG had smaller dimensions than those in the TIFF header. However, the consequences of larger dimensions than those in the TIFF header had not been considered.
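    Since the destination buffer is sized from the TIFF header, a safer policy is to refuse any embedded JPEG whose dimensions exceed the header values rather than merely warning. A minimal sketch of that guard (not the actual libtiff patch):

    /* Hypothetical guard: an embedded JPEG smaller than the TIFF header
       claims can be tolerated, but a larger one must be rejected because
       the pixel buffer was allocated from the TIFF header values. */
    static int jpeg_dims_ok(unsigned tiff_w, unsigned tiff_h,
                            unsigned jpeg_w, unsigned jpeg_h)
    {
        return jpeg_w <= tiff_w && jpeg_h <= tiff_h;
    }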

We can draw some interesting conclusions from these bugs. The specific vulnerabilities are integer overflows, out-of-bounds array accesses and buffer overflows. However, the general theme is using an integer from an untrusted source without adequately sanity checking it. Integer abuse issues are still very common, particularly in code that decodes untrusted binary data or protocols. We recommend being careful when using any such code until it has been vetted for security (by extensive code auditing, fuzz testing, or preferably both). It is also important to watch for security updates for any decoding software you use, and to keep patching up to date.

Information flow tracing and software testing



Security testing of applications is regularly performed using fuzz testing. As previously discussed on this blog, Srinath's Lemon uses a form of smart fuzzing. Lemon is aware of classes of web application threats and the input families which trigger them, but not all fuzz testing frameworks have to be this complicated. Fuzz testing originally relied on purely random data, ignorant of specific threats and known dangerous input. Today, this approach is often overlooked in favor of more complicated techniques. Early sanity checks in applications looking for something as simple as a version number may render testing with completely random input ineffective. However, the newer, more complicated fuzz testers require a considerable initial investment in the form of complete input format specifications or the selection of a large corpus of initial input samples.

At WOOT'07, I presented a paper on Flayer, a tool we developed internally to augment our security testing efforts. In particular, it enables a fuzz testing technique that strikes a compromise between the original purely random approach and the most complicated ones. Flayer makes it possible to remove input sanity checks at execution time. With the small investment of identifying these checks, Flayer allows completely random testing to be performed with much higher efficacy. Already, we've uncovered multiple vulnerabilities in Internet-critical software using this approach.
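To make the idea concrete, consider a parser guarded by a magic-number check: purely random inputs will almost never match the magic bytes, so the code behind the check is essentially never exercised. The toy example below is not taken from any real program; it only illustrates the kind of branch that Flayer lets you neutralize at execution time:

    #include <stdint.h>
    #include <string.h>

    /* A random 4-byte prefix matches this (made-up) magic value only about
       once in four billion attempts, so blind random fuzzing rarely reaches
       the parsing code behind the check. Forcing or removing this branch at
       runtime lets random inputs exercise that code. */
    int parse_header(const uint8_t *buf, size_t len)
    {
        static const uint8_t magic[4] = { 0x4D, 0x41, 0x47, 0x31 };
        if (len < sizeof magic || memcmp(buf, magic, sizeof magic) != 0)
            return -1;              /* early sanity check: reject the input */
        /* parse_body(buf + 4, len - 4);   the code we actually want to test */
        return 0;
    }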

The way that Flayer allows sanity checks to be identified is perhaps the most interesting point. Flayer uses a dynamic analysis framework to analyze the target application at execution time. It marks, or taints, input to the program and traces that data throughout its lifespan. Considerable research has been done in the past on information flow tracing using dynamic analysis, primarily aimed at malware and exploit detection and defense. However, none of the resulting software has been made publicly available.

While Flayer is still in its early stages, it is available for download under the GNU General Public License. External contributions and feedback are encouraged!

Google Finance in Canada - for real



Though it was noted a bit earlier, now we're really pleased to introduce Google Finance Canada, a localized version of Google Finance tailored specifically, as you might guess, for Canadian investors. Canadians are the second largest group of Google Finance users, and as a Canadian myself, I'm excited to see Canadian financial information presented in the familiar, easy-to-use Google Finance format. This new edition includes:
  • Top financial news from Canadian sources
  • Search with a preference for Canadian companies
  • Front-page high level economic data from the Bank of Canada
  • Portfolios in Canadian currency (or the currency of your choice)
  • Equity data from the Toronto Stock Exchange, TSX Venture Exchange, and Canadian mutual funds

In addition, stock quotes and charts for Canadian-listed companies are now available through the Google.com web search.

Now there's Google Finance for Canada - update



Oops. We hit the button too soon. Watch for news about Google Finance in Canada next Tuesday.

Robots Exclusion Protocol: now with even more flexibility



This is the third and last in my series of blog posts about the Robots Exclusion Protocol (REP). In the first post, I introduced robots.txt and the robots META tags, giving an overview of when to use them. In the second post, I shared some examples of what you can do with the REP. Today, I'll introduce two new features that we have recently added to the protocol.

As a product manager, I'm always talking to content providers to learn about your needs for the REP. We are constantly looking for ways to improve the control you have over how your content is indexed, and these new features give you flexible and convenient ways to fine-tune that control with Google.

Tell us if a page is going to expire
Sometimes you know in advance that a page is going to expire in the future. Maybe you have a temporary page that will be removed at the end of the month. Perhaps some pages are available free for a week, but after that you put them into an archive that users pay to access. In these cases, you want the page to show in Google search results until it expires, then have it removed: you don't want users getting frustrated when they find a page in the results but can't access it on your site.

We have introduced a new META tag that allows you to tell us when a page should be removed from the main Google web search results: the aptly named unavailable_after tag. It follows a similar syntax to the other REP META tags. For example, to specify that an HTML page should be removed from the search results after 3pm Eastern Standard Time on 25th August 2007, simply add the following tag to the <HEAD> section of the page:

<META NAME="GOOGLEBOT" CONTENT="unavailable_after: 25-Aug-2007 15:00:00 EST">

The date and time are specified in the RFC 850 format.

This information is treated as a removal request: it will take about a day after the removal date passes for the page to disappear from the search results. We currently only support unavailable_after for Google web search results.

After the removal, the page stops showing in Google search results but it is not removed from our system. If you need a page to be excised from our systems completely, including any internal copies we might have, you should use the existing URL removal tool which you can read about on our Webmaster Central blog.

Meta tags everywhere
The REP META tags give you useful control over how each webpage on your site is indexed, but they only work for HTML pages. How can you control access to other types of documents, such as Adobe PDF files, video and audio files and other formats? Well, now the same flexibility for specifying per-URL tags is available for all other file types.

We've extended our support for META tags so they can now be associated with any file. Simply add any supported META tag to a new X-Robots-Tag directive in the HTTP Header used to serve the file. Here are some illustrative examples:
  • Don't display a cache link or snippet for this item in the Google search results:
X-Robots-Tag: noarchive, nosnippet
  • Don't include this document in the Google search results:
X-Robots-Tag: noindex
  • Tell us that a document will be unavailable after 7th July 2007, 4:30pm GMT:
X-Robots-Tag: unavailable_after: 7 Jul 2007 16:30:00 GMT

You can combine multiple directives in the same document. For example:
  • Do not show a cached link for this document, and remove it from the index after 23rd July 2007, 3pm PST:
X-Robots-Tag: noarchive
X-Robots-Tag: unavailable_after: 23 Jul 2007 15:00:00 PST
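To put these directives in context, the complete set of response headers served for, say, a PDF file might look something like this (a hypothetical response, shown only for illustration):

HTTP/1.1 200 OK
Content-Type: application/pdf
X-Robots-Tag: noarchive
X-Robots-Tag: unavailable_after: 23 Jul 2007 15:00:00 PST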


Our goal for these features is to provide more flexibility for indexing and inclusion in Google's search results. We hope you enjoy using them.

Computer science resources for academics



Google has a long history of involvement with universities, and we're excited to share some recent news on that front with you. At the main Google campus this week we're hosting the Google Faculty Summit, which brings together faculty from universities all over to participate in discussions about what we're up to in research-land as well as computer science education - something very near and dear to us.

Meanwhile, because we know that between teaching, doing research and advising students, computer science educators are quite strapped for time, we've recently launched a site called Google Code for Educators. While you may have previously heard about our offerings for K-12 teachers, this new program is focused on CS topics at the university level, and lets us share the knowledge we've built up around things like distributed systems and AJAX programming. It's designed for university faculty to learn about new computer science topics and include them in their courses, as well as to help curious students learn on their own.

Right now, Google Code for Educators offers materials for AJAX web programming, distributed systems and parallel programming, and web security. The site includes slides, programming labs, problem sets, background tutorials and videos. We're eager to provide more content areas and also more iterations for existing topic areas. To allow for liberal reuse and remixing, most sample course content on Code EDU is available under a Creative Commons license. Please let us know your thoughts on this new site.

Beyond CS education, another important faculty topic is research. Google Research offers resources to CS researchers, including papers authored by Googlers and a wide variety of our tech talks. You might be interested in learning more about MapReduce and the Google File System, two pieces of Google-grown technology that have allowed us to operate at enormous scale. We also recently put together a few university research programs and we're eager to see what academics come up with.

What Eric Schmidt did this summer



In case you're thinking summer is the time to slow down, that's not always true around here. Our CEO has been on the go on behalf of a number of our public policy initiatives. And our Public Policy blog has been keeping up with him.

Earth to the Enterprise



With more than 200 million downloads and counting, Google Earth is known around the world. Less well-known is our Google Earth Enterprise, which companies, organizations and government agencies use to view their global data and imagery. Experts and amateurs alike use it for everything from designing new buildings to exploring for energy to responding to emergencies, because Google Earth Enterprise offers access to geospatial info that was once limited to specialty applications.

For instance, check out Dell's implementation showing a geographic view of traffic to Dell.com:



Today, we're releasing the latest version, which makes it easy to publish and view Google Earth datasets in 2D using a browser. By accessing Google Earth Enterprise from a web browser, employees across an organization will benefit from the rich geographic tapestry. There's more detail on the Google Lat/Long blog.

Like making videos? Love Gmail?



A couple weeks back, some of us on the Gmail team were talking about how simple it's become to connect with people around the world through email. And we got to thinking: what if email was delivered via a Rube Goldberg machine, but instead of gears and levers, people on everything from bicycles to submarines brought messages from one place to another? So we had a little fun with a collaborative video depicting just that (well, not the submarines).

Now it's time to let everyone in on the action. Learn more at our new Gmail Blog, or go directly to http://mail.google.com/mvideo where you'll find directions on how to submit your clip.



Calling all SketchUp fans



It's my great pleasure to announce the launch of the Official Google SketchUp Blog. Fast-breaking news, tips and tricks, user stories and just the right amount of office intrigue await anyone who pays us a visit. Also, sexy mustache contests.

For those of you who have no idea what SketchUp is, I'll start at the beginning: The world is three-dimensional. Designing a house, building a piece of furniture and navigating through a city all involve three-dimensional decisions. SketchUp is 3D modeling software that anyone can use to build models of whatever they like.

Check out the 3D Warehouse to see models from people all over the world, and turn on the 3D Warehouse layer in Google Earth to explore cities with realistic 3D buildings made in SketchUp (Denver is particularly impressive). If you like, you can download the free version and start building models yourself.

Our commitment to open broadband platforms



For several years now, many Googlers have been working to identify the obstacles that prevent the Internet from being available to everyone on the planet. It strikes us as unfair that some people should enjoy such abundant access to this rich resource while billions of others aren't so lucky. Though the technology to provide access on a global scale exists today, we have often learned that technology isn't the problem. In this context, we have worked hard to advance a set of principles that will make Internet access for all a priority.

For instance, we wrote last week on our Public Policy Blog about Google's interest in promoting competition in the broadband market here in the U.S., to help ensure that as many Americans as possible can access the Internet. However, it takes more than just ideas and rhetoric if you want to help bring the Internet to everyone.

So today, we're putting consumers' interests first, and putting our money where our principles are -- to the tune of $4.6 billion. Let me explain.

In the U.S., wireless spectrum for mobile phones and data is controlled by a small group of companies, leaving consumers with very few service providers from which to choose. With that in mind, last week, as the federal government prepares for what is arguably its most significant auction of wireless spectrum in history, we urged the Federal Communications Commission (FCC) to adopt rules to make sure that regardless of who wins the spectrum at auction, consumers' interests are the top priority. Specifically, we encouraged the FCC to require the adoption of four types of "open" platforms as part of the auction:
  • Open applications: consumers should be able to download and utilize any software applications, content, or services they desire;
  • Open devices: consumers should be able to utilize their handheld communications device with whatever wireless network they prefer;
  • Open services: third parties (resellers) should be able to acquire wireless services from a 700 MHz licensee on a wholesale basis, based on reasonably nondiscriminatory commercial terms; and
  • Open networks: third parties (like Internet service providers) should be able to interconnect at any technically feasible point in a 700 MHz licensee's wireless network.
As numerous public interest organizations noted earlier this week, all four of these conditions adopted together would promote a spirit of openness, and could spur additional forms of competition from web-based entities, such as software applications providers, content providers, handset makers, and ISPs. The big winners? Consumers. As choices increase, prices come down and more Americans have access to the Net.

The FCC is currently considering draft rules for the auction, and the reports we've heard are that those rules include some -- but not all four -- of the openness conditions that we and consumer groups support. While any embrace of open platforms is welcome, only if the FCC adopts all four principles will we see the genuinely competitive marketplace that Americans deserve. In particular, guaranteeing open services and open networks would ensure that entrepreneurs starting new networks and services will have a fair shot at success, in turn giving consumers a wider choice of broadband providers.

There are some who have claimed that embracing these principles and putting American consumers first might somehow devalue this spectrum. As much as we don't believe this to be the case, actions speak louder than words. That's why our CEO Eric Schmidt today sent a letter to FCC Chairman Kevin Martin, saying that, should the FCC adopt all four license conditions requested above, Google intends to commit at least $4.6 billion to bidding for spectrum in the upcoming 700 Mhz auction.

Why $4.6 billion? While we think that a robust and competitive auction based on these four principles will likely produce much higher bids, and we are eager to see a diverse set of bidders competing, $4.6 billion is the reserve price that the FCC has proposed for the auction. With any concerns about revenue to the U.S. Treasury thus satisfied, we hope the FCC can return its attention to adopting openness principles for the benefit of consumers.

In the meantime, thank you to those who have reached out to help with our efforts. It feels good to see how many of you support true competition for the benefit of consumers and we look forward to hearing from even more of you in the days to come.

For now, and for all of us, the issue is simple: this is one of the best opportunities we will have to bring the Internet to all Americans. Let's seize that opportunity.

Note: We've cross-posted this to our Public Policy Blog.

Your Campus in 3D winners announced



The results are in for the winners of the Build Your Campus in 3D Competition, which we announced in January. The judges chose 7 teams from among the dozens who submitted more than 4,000 buildings from colleges and universities all over North America. And the winning school teams who will be joining us in Mountain View are:

University of Minnesota | Twin Cities, Minnesota
Purdue University | West Lafayette, Indiana
Concordia University | Montreal, Quebec
Indiana University - Purdue University Fort Wayne | Fort Wayne, Indiana
Franklin W. Olin College of Engineering | Needham, Massachusetts
Dartmouth College | Hanover, New Hampshire
Stanford University | Stanford, California

Check out the competition site to see more details about the judges, the rules, the winners, and what they won. From there, you can follow a link to see the winning campuses in your copy of Google Earth. Again, congrats to the winning teams, and a big thank you to everyone who participated.

Opening up Google Print Ads



Even with the growth of online news sites, Americans still read newspapers. Over the course of a typical week, nearly 3 out of 4 adults (115 million) in the top 50 markets read a copy of a daily or Sunday newspaper.* That's why thousands of businesses use print advertising every day to reach a local audience, and why we've announced that we're extending Google AdWords to newspapers for most U.S. advertisers. To learn more, visit the Google Print Ads™ site, or read about it on the Inside AdWords blog.

*Scarborough Research USA, Release 2, 2006.

Hosted site search for businesses

Nitin Mangtani, Product Manager, Enterprise Search and Rajat Mukherjee, Group Product Manager, Search

Businesses spend a lot of effort and energy creating and promoting great websites for their products and services, but quality search is often missing. As a result, businesspeople often ask us why they can't use Google to power search on their sites.

Today we've released Custom Search Business Edition (CSBE) to do just that. CSBE is a hosted site search solution that provides Google-quality results for your website. It's fast, relevant, reliable, and flexible, so that users can quickly find what they're looking for through search results customized and integrated into your business website.

CSBE builds on the Google Custom Search Engine, a hosted search solution we introduced last October that allows organizations to create a search engine and search results that are tailored to their point of view. All well and good, but businesses have asked us for greater flexibility and support -- and we're addressing these needs with CSBE. Businesses that want further control over results presentation and integration with their website can obtain results through XML. Now those of you with business sites have the option to turn off ads and have further control over branding. In addition, CSBE provides options for email and phone support. The pricing starts at $100 per year for searching up to 5,000 pages.

This offering should be a great help to the millions of businesses that have a web presence but don't offer users any way to search the site. Instead of being left on their own to navigate content, visitors to CSBE-enabled sites will be able to navigate through search results without ever leaving the site. We hope an improved customer search experience will translate into more referrals, more opportunities for e-commerce, and more satisfied online customers for these businesses. Here's more about CSBE.

Automating web application security testing



Cross-site scripting (aka XSS) is the term used to describe a class of security vulnerabilities in web applications. An attacker can inject malicious scripts to perform unauthorized actions in the context of the victim's web session. Any web application that serves documents that include data from untrusted sources could be vulnerable to XSS if the untrusted data is not appropriately sanitized. A web application that is vulnerable to XSS can be exploited in two major ways:

    Stored XSS - Commonly exploited in a web application where one user enters information that's viewed by another user. An attacker can inject malicious scripts that are executed in the context of the victim's session when the victim visits the website at some point in the future. Improperly sanitized blog comments and guestbook entries are common examples of features that facilitate stored XSS.

    Reflected XSS - An application that echoes improperly sanitized user input received as query parameters is vulnerable to reflected XSS. With a vulnerable application, an attacker can craft a malicious URL and send it to the victim via email or any other mode of communication. When the victim visits the tampered link, the page is loaded along with the injected script that is executed in the context of the victim's session.

The general principle behind preventing XSS is the proper sanitization (via, for instance, escaping or filtering) of all untrusted data that is output by a web application. If untrusted data is output within an HTML document, the appropriate sanitization depends on the specific context in which the data is inserted into the HTML document. The context could be in the regular HTML body, tag attributes, URL attributes, URL query string attributes, style attributes, inside JavaScript, HTTP response headers, etc.

The following are some (by no means complete) examples of XSS vulnerabilities. Let's assume there is a web application that accepts user input as the 'q' parameter. In each example below, the untrusted data coming from the attacker is the injected evil_script() markup.

  • Injection in regular HTML body - angled brackets not filtered or escaped

    <b>Your query '<script>evil_script()</script>' returned xxx results</b>

  • Injection inside tag attributes - double quote not filtered or escaped

    <form ...
      <input name="q" value="blah"><script>evil_script()</script>">
    </form>

  • Injection inside URL attributes - non-http(s) URL

    <img src="javascript:evil_script()">...</img>

  • In JavaScript context - single quote not filtered or escaped

    <script>
      var msg = 'blah'; evil_script(); //';
      // do something with msg variable
    </script>


In the cases where XSS arises from meta characters being inserted from untrusted sources into an HTML document, the issue can be avoided either by filtering/disallowing the meta characters, or by escaping them appropriately for the given HTML context. For example, the HTML meta characters <, >, &, " and ' must be replaced with their corresponding HTML entity references &lt;, &gt;, &amp;, &quot; and &#39; respectively. In a JavaScript-literal context, inserting a backslash in front of \, ' and ", and converting carriage returns, line feeds and tabs into \r, \n and \t respectively, should prevent untrusted meta characters from being interpreted as code.
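As a concrete illustration of escaping for the regular HTML body and quoted-attribute contexts, a minimal replacement routine might look like the sketch below. It is illustrative only; real applications should rely on a well-tested escaping library and must still choose the right escaping for each output context:

    #include <stdio.h>

    /* Minimal sketch: copy 's' to 'out', replacing the five HTML meta
       characters with entity references so that untrusted text cannot break
       out of an HTML body or a quoted attribute value. */
    static void html_escape(FILE *out, const char *s)
    {
        for (; *s; s++) {
            switch (*s) {
            case '<':  fputs("&lt;", out);   break;
            case '>':  fputs("&gt;", out);   break;
            case '&':  fputs("&amp;", out);  break;
            case '"':  fputs("&quot;", out); break;
            case '\'': fputs("&#39;", out);  break;
            default:   fputc(*s, out);       break;
            }
        }
    }

Applied to the first injection example above, escaping in this way renders the injected <script> tag as inert text rather than executable markup.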

How about an automated tool for finding XSS problems in web applications? Our security team has been developing a black box fuzzing tool called Lemon (deriving from the commonly-recognized name for a defective product). Fuzz testing (also referred to as fault-injection testing) is an automated testing approach based on supplying inputs that are designed to trigger and expose flaws in the application. Our vulnerability testing tool enumerates a web application's URLs and corresponding input parameters. It then iteratively supplies fault strings designed to expose XSS and other vulnerabilities to each input, and analyzes the resulting responses for evidence of such vulnerabilities. Although it started out as an experimental tool, it has proved to be quite effective in finding XSS problems. Besides XSS, it finds other security problems such as response splitting attacks, cookie poisoning problems, stacktrace leaks, encoding issues and charset bugs. Since the tool is homegrown it is easy to integrate into our automated test environment and to extend based on specific needs. We are constantly in the process of adding new attack vectors to improve the tool against known security problems.
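A heavily simplified version of the reflected-XSS portion of such a tool might inject a distinctive marker into a query parameter and then check whether the marker comes back unescaped. The sketch below uses libcurl and is purely illustrative: it is not Lemon, the target URL is hypothetical, and a real tool would also crawl the application, try many encodings, and analyze responses far more carefully:

    #include <curl/curl.h>
    #include <stdio.h>
    #include <string.h>

    #define MARKER "<xss_probe_12345>"   /* made-up fault string */

    /* Collect the response body into a fixed-size buffer, truncating if needed. */
    struct body { char data[65536]; size_t len; };

    static size_t on_body(char *ptr, size_t size, size_t nmemb, void *userdata)
    {
        struct body *b = userdata;
        size_t n = size * nmemb;
        size_t room = sizeof b->data - 1 - b->len;
        if (n > room)
            n = room;
        memcpy(b->data + b->len, ptr, n);
        b->len += n;
        b->data[b->len] = '\0';
        return size * nmemb;             /* report everything as consumed */
    }

    /* Returns 1 if the marker is reflected back unescaped -- a strong hint
       of reflected XSS that a human should then verify. */
    static int probe(const char *url_with_marker)
    {
        struct body b = { .len = 0 };
        CURL *c = curl_easy_init();
        if (!c)
            return 0;
        curl_easy_setopt(c, CURLOPT_URL, url_with_marker);
        curl_easy_setopt(c, CURLOPT_WRITEFUNCTION, on_body);
        curl_easy_setopt(c, CURLOPT_WRITEDATA, &b);
        CURLcode rc = curl_easy_perform(c);
        curl_easy_cleanup(c);
        return rc == CURLE_OK && strstr(b.data, MARKER) != NULL;
    }

    int main(void)
    {
        curl_global_init(CURL_GLOBAL_DEFAULT);
        /* Hypothetical target; a real tool would enumerate URLs and parameters. */
        if (probe("http://test.example/search?q=%3Cxss_probe_12345%3E"))
            printf("possible reflected XSS\n");
        curl_global_cleanup();
        return 0;
    }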

Update:
I wanted to respond to a few questions that seem to be common among readers. I've listed them below. Thanks for the feedback. Please keep the questions and comments coming.

Q. Does Google plan to market it at some point?
A. Lemon is highly customized for Google apps and we have no plans to release it in the near future.

Q. Did Google's security team check out any commercially available fuzzers? Is the ability to keep improving the fuzzer the main draw of a homegrown tool?
A. We did evaluate commercially available fuzzers but felt that our specialized needs could be served best by developing our own tools.

Cookies: expiring sooner to improve privacy



We are committed to an ongoing process to improve our privacy practices, and have recently taken a closer look at the question of cookie privacy. How long should a web site "remember" cookie information in its logs after a user's visit? And when should a cookie expire on your computer? Cookie privacy is both a server and a client issue.

On the server side, we recently announced that we will anonymize our search server logs — including IP addresses and cookie ID numbers — after 18 months.

Now, we're asking the question about cookie lifetime: when should a cookie expire on your computer? For background: a cookie is a very small file which gets stored on your computer. All search engines and most websites use cookies. Why? Cookies remind us of your preferences from the last time you visited our site. For example, Google uses our so-called "PREF cookie" to remember our users' basic preferences, such as the fact that a user wants search results in English, no more than 10 results on a given page, or a SafeSearch setting to filter out explicit sexual content. When we originally designed the PREF cookie, we set the expiration far into the future — in 2038, to be exact — because the primary purpose of the cookie was to preserve preferences, not to let them be forgotten. We were mindful of the fact that users can always go to their browsers to change their cookie management settings, e.g. to delete all cookies, delete specific cookies, or accept certain types of cookies (like first-party cookies) but reject others (like third-party cookies).

After listening to feedback from our users and from privacy advocates, we've concluded that it would be a good thing for privacy to significantly shorten the lifetime of our cookies — as long as we could find a way to do so without artificially forcing users to re-enter their basic preferences at arbitrary points in time. And this is why we’re announcing a new cookie policy.

In the coming months, Google will start issuing our users cookies that will be set to auto-expire after 2 years, while auto-renewing the cookies of active users during this time period. In other words, users who do not return to Google will have their cookies auto-expire after 2 years. Regular Google users will have their cookies auto-renew, so that their preferences are not lost. And, as always, all users will still be able to control their cookies at any time via their browsers.

Together, these steps — logs anonymization and cookie lifetime reduction — are part of our ongoing plan to continue innovating in the area of privacy to protect our users.

Nonprofits mix it up with Google Apps

Search your blog world

Rajat Mukherjee, Group Product Manager

What happens when you put together a popular blogging platform, a customizable search experience and a flexible search API? You get a Search Box widget for Blogger, built using the AJAX Search API, and powered by a Linked Custom Search Engine (CSE).

Configure this widget on your Blogger blog and you can immediately search not just your blog posts, but across all the link lists/blogrolls you've set up on your blog and the links you've made from your posts.

The widget is now available on Blogger in Draft, Blogger's experimental site. Once you've logged in and configured the widget, visitors to your blog will see a search box there. The search experience inherits your blog's look and feel, and is uniquely flavored around pages you've linked to from your blog.

To add the widget:
  1. Edit your blog's layout.
  2. Click on "Add a page element" and configure the "Search Box" widget.
Your link lists will automatically show up as optional tabs for your search; you can decide which ones you want to configure. Go ahead -- custom-search-enable your blog!

Update: Re-posted with copy written for this blog.

Overview of our accessible services



From time to time, our own T.V. Raman shares his tips on how to use Google from his perspective as a technologist who cannot see -- tips that sighted people, among others, may also find useful. - Ed.

We provide a wide variety of services that are mostly accessed with a web browser. People visit Google from a large number of browsers and platforms; in addition, we also understand that every user is special and may have special needs. Accessibility at Google is about making sure that our services work well for all our users -- independent of your needs and abilities at any given time.

Web search, our primary service, has a very simple interface and has always been accessible from a variety of user environments. Newer services that present highly interactive interfaces continue to present accessibility challenges when used with specialized adaptive technologies such as screenreaders. We are committed to finding accessibility solutions that make our services work better for everyone who visits.

Here's a list of our accessibility-related services and a few solutions to some accessibility challenges.

  • Web Search: Result pages include headers to delineate logical sections.
  • Accessible Search: Promotes results that are accessible.
  • Book Search: Full-text access to public-domain works.
  • Gmail: A simple yet functional HTML mode that works well with screenreaders.
  • Gmail Mobile: A lightweight user interface that is also speech-friendly.
  • Google Maps: Easy-to-use textual directions.
  • Calendar: A functional, yet speech-friendly user interface.
  • Audio Captchas: All services that use Google Accounts provide an audio alternative for the visual challenge-response tests that are used to distinguish humans from machines.
  • Mobile Transcoder: A mobile lens for viewing the web that produces accessible views.
  • Google Video: Allows uploaded videos to contain captions/subtitles in multiple languages for viewers who are hearing-impaired or unfamiliar with the original language.
  • Google Talk: IM clients inside a web browser can pose accessibility challenges, but the use of the open Jabber API means that Google users can choose from a variety of Jabber clients, many of which work well with adaptive technologies.
  • Web APIs: Many Google services offer high-level web APIs that aid in authoring mashups; this provides a means for creating highly customized accessible views.
  • 1-800-GOOG-411: Here's an exception to the rule that we deliver most things through a web browser. Our experimental Voice Local Search service lets anyone who can speak into a phone search for a local business by name or category; get connected to the business free of charge; get the details by SMS if you’re using a mobile phone. (Just say "text message".)
Finally, many Google services such as Google Scholar, Google News, Blogger and Google Product Search work out of the box. While today's screenreaders can hit some bumps on the road when using more advanced features in these products, these web interfaces degrade gracefully to provide a functional interface.

If any of this interests you, we invite you to participate in our user community. Please tell us what works well, share your own tips on using Google services, and make sure to tell us what could be made even better.

Update: Added info on 1-800-GOOG-411.


Bloggin' down under



To our international friends, Australia can have a rep for Crocodile Dundee jokes, poisonous animals and meat pies with mushy peas. What you'll find when you visit the new (ish) Google Australia Blog are the real reasons why Google has made a significant investment in Australia. And right now there's a post explaining the special doodle that was on the Google Australia homepage for NAIDOC Week.

Both an engineering and sales support hub, our Aussie team is an eclectic mix of people from all walks of life. We're lucky enough to be the home of the original Google Maps team, so we're at the forefront of global product releases. Aussie Googlers don't take ourselves too seriously, we love a good laugh, we're always happy to make fools of ourselves for a good cause (so long as we're beating the Poms in the Ashes).

Visit our blog to read about new product launches for our Australian users, musings on life as an Aussie Googler and what the team gets up to in the community. (Also, stop by if you want to see words like "centre", "maximise" and "humour" spelt correctly). We hope to see you there!

The reason behind the "We're sorry..." message




Some of you might have seen this message while searching on Google, and wondered what the reason behind it might be. Instead of search results, Google displays the "We're sorry" message when we detect anomalous queries from your network. A regular user can answer a CAPTCHA - a reverse Turing test meant to establish that we are talking to a human user - and continue searching. However, automated processes such as worms would have a much harder time solving the CAPTCHA. Several things can trigger the sorry message. Often it's due to infected computers or DSL routers that proxy search traffic through your network - this may be at home or even at a workplace where one or more computers might be infected. Overly aggressive SEO ranking tools may trigger this message, too. In other cases, we have seen self-propagating worms that use Google search to identify vulnerable web servers on the Internet and then exploit them. The exploited systems in turn search Google for more vulnerable web servers, and so on. This can lead to a noticeable increase in search queries, and the sorry message is one of our mechanisms for dealing with this.

At ACM WORM 2006, we published a paper on Search Worms [PDF] that takes a much closer look at this phenomenon. Santy, one of the search worms we analyzed, looks for remote-execution vulnerabilities in the popular phpBB2 web application. In addition to exhibiting worm like propagation patterns, Santy also installs a botnet client as a payload that connects the compromised web server to an IRC channel. Adversaries can then remotely control the compromised web servers and use them for DDoS attacks, spam or phishing. Over time, the adversaries have realized that even though a botnet consisting of web servers provides a lot of aggregate bandwidth, they can increase leverage by changing the content on the compromised web servers to infect visitors and in turn join the computers of compromised visitors into much larger botnets. This fundamental change from remote attack to client based download of malware formed the basis of the research presented in our first post. In retrospect, it is interesting to see how two seemingly unrelated problems are tightly connected.

Welcome, Postini team



We launched Google Apps so that it would be easier for employees to communicate and share information while reducing the hassles and costs associated with enterprise software. Companies are responding: every day, more than 1,000 small businesses sign up for Google Apps.

Larger enterprises, however, face a challenge: though they want to deliver simple, useful hosted applications to their employees, they're also required to support complex business rules, information security mandates, and an array of legal and corporate compliance issues. In effect, many businesses use legacy systems not because they are the best for their users, but because they are able to support complex business rules. This isn't a tradeoff that any business should have to make.

We realized that we needed a more complete way to address these information security and compliance issues in order to better support the enterprise community. That's why we're excited to share the news that we've agreed to acquire Postini, a company that offers security and corporate compliance solutions for email, IM, and other web-based communications. Like Google Apps, Postini's services are entirely hosted, eliminating the need to install any hardware or software. A leader in its field, Postini serves more than 35,000 businesses and 10 million users, and was one of our first partners for Google Apps. Their email and IM management services include inbound and outbound policy management, spam and virus protection, content filtering, message archiving, encryption, and more. We will continue to support Postini's customers and we look forward to the possibilities ahead.

Here's the press release announcing the deal, and there's more detail in our FAQ and on the Enterprise blog.

Ever more books to read



From time to time, our own T.V. Raman shares his tips on how to use Google from his perspective as a technologist who cannot see -- tips that sighted people, among others, may also find useful.

As a book lover, I couldn't have been more excited by the advent of electronic books in the early 90s. And with the rise of the Internet, the possibility of being able to discover books online was really exciting.

I work on a project at Google called Google Accessible Search, which helps promote results that are more accessible to visually impaired users. Building on that work is today's release of accessible public domain works through Google Book Search. It's opening up hundreds of thousands of books to people who use adaptive technologies such as speech output, screen readers, and Braille displays.

I'm extremely enthusiastic about many of our efforts at Google, which ultimately have the potential to make the experience of accessing information for visually impaired users just as easy as it is for those with sight. My reading used to be determined by what was available in a form I could read. But today it's a question of using Google effectively so that I can find the right thing to read. Today's Book Search launch is an excellent example of how technology is truly democratizing access to the world's information.

If you have feedback about Google's accessibility services, join our group to share tips on what works well and what could be made better.

All aboard



We're pleased to announce that we have acquired GrandCentral Communications, a company that provides services for managing your voice communications. GrandCentral is an innovative service that lets users integrate all of their existing phone numbers and voice mailboxes into one account, which can be accessed from the web. We think GrandCentral's technology fits well into Google's efforts to provide services that enhance the collaborative exchange of information between our users.

GrandCentral offers many features that complement the phone services you already use. If you have multiple phone numbers (e.g., home, work, cell), you get one phone number that you can set to ring all, some, or none of your phones, based on who's calling. This way, your phone number is tied to you, and not your location or job. The service also gives you one central voice mailbox. You can listen to your voicemails online or from any phone, forward them to anybody, add the caller to your address book, block a caller as spam, and a lot more. You can even listen in on voicemail messages from your phone while they are being recorded, or switch a call from your cell phone to your desk phone and back again. All in all, you'll have a lot more control over your phones.

We're really excited to welcome the GrandCentral team to Google. While we're moving their technology over to Google's network, a limited number of invitations will be available to register for a GrandCentral beta account. If you have a U.S. telephone number, you can sign up for an invitation at www.grandcentral.com. Current GrandCentral customers will continue to have uninterrupted access to the service.

Google and health care



In a world of 24/7 news cycles, a summer weekend can bring considerable -- and unanticipated -- excitement. Take for example the reaction we've just seen to an item on our new health advertising blog. Frankly, we were surprised by the pickup, but perhaps we shouldn't have been. We've been proponents of corporate blogging for some time, despite the significant communication challenges that obviously arise from having many voices from all parts of our company speak publicly through blog posts. In this case, the blog criticized Michael Moore's new film "Sicko" to suggest how health care companies might use our ad programs when they face controversy. Our internal review of the piece before publication failed to recognize that readers would -- properly, but incorrectly -- impute the criticisms as reflecting Google's official position. We blew it.

In fact, Google does share many of the concerns that Mr. Moore expresses about the cost and availability of health care in America. Indeed, we think these issues are sufficiently important that we invited our employees to attend his film (nearly 1,000 people did so). We believe that it will fall to many entities -- businesses, government, educational institutions, individuals -- to work together to solve the current system's shortcomings. This is one reason we're deploying our technology and our expertise with the hope of improving health system information for everyone who is or will become a patient. Over the last several months, we have been blogging about our thinking in this area. See: November 30, 2006, March 28, May 23, and June 14, 2007.

In the meantime, we have taken steps on our own to address the failures we see in our health care system. In our case, the menu of health care options that we offer our employees includes both direct services (for example, on-site medical and dental professionals in certain locations) as well as a range of preventive care programs. It's one of the ways we're attempting to demonstrate corporate responsibility on a major issue of our time.

1-800-GOOG-411: now with maps



In case you hadn't heard, a few months back we launched 1-800-GOOG-411 (1-800-466-4411) in the U.S. It's a free telephone service that lets you search for businesses by voice and get connected to those businesses for free.

Today, your GOOG-411 experience just got better: during your call to GOOG-411, just say "map it", and you'll get a text message with the details of your search plus a link to a map of your results right on your mobile phone.

Try it out, and add us to your phone book while you're at it. Let us know what you think either by emailing us or by joining our discussion group.

Google Desktop now available for Linux



Just a few months after Google Desktop became available for the Mac, I'm happy to tell you it's now available for Linux users too. Google Desktop for Linux makes searching your computer as easy as searching the web with Google. Not only can you rediscover important documents that have been idling on your hard drive for years, but you can also search through emails saved in Gmail or other applications. All office files, including documents and slides created with OpenOffice.org, can be easily found. Since some Linux users are program developers, Google Desktop was designed with the ability to search source code and information contained in .pdf, .ps, .man and .info documents. It also features the Quick Search Box, which you can call up by pressing the Ctrl key twice. Type a few letters or words into the search box and your top results pop up instantly. Keeping with a global focus, you can use it in English, French, Italian, German, Spanish, Portuguese, Dutch, Chinese, Japanese, and Korean -- and it works with many versions of Linux too.

With this launch, Google Desktop is now available for Windows, Mac, and Linux. Try it out now and read more on the Google Desktop Blog.

Ga-Ga for Gadgets



Sometimes I think I know a lot. I can code like a champ and also know the difference between a Monet and a Manet. But on closer inspection, maybe I don't know very much at all. When it comes to fine wines, for instance, I can't tell the difference between Châteauneuf-du-Pape and Chateau-de-Cardboard, and if you asked me who played in the Super Bowl last year, I'd probably say the Dolphins. And lots of people at Google are like me: we know some things, and have some good ideas, but we certainly don't know everything or have all the good ideas.

So when we designed iGoogle, our personalized homepage, we baked that recognition right into the product by developing the Google Gadgets API. Google Gadgets are applications that developers can create and anyone can embed into their iGoogle homepage or their own website. In the year and a half since we launched Google Gadgets, we've seen a lot of growth in this program. The developer community has created thousands of gadgets, and the top gadgets get tens of millions of pageviews per week. This is great for our developers, as iGoogle gives the gadgets broad distribution, and it's great for our iGoogle users, as they benefit from a richer variety of options for their personalized homepage. There have been some really interesting gadgets created, from to-do lists to Zelda, from a pair of eyes that follow your mouse around the screen to an entire customer relationship management (CRM) application.

We've been hearing from a lot of gadget developers that they'd like to spend more time developing if they could, and we've been thinking about ways to help them do that. To that end, we're happy to announce Google Gadget Ventures, a new pilot program that will help fund third-party gadget development and gadget-related businesses. We plan to offer two types of funding: $5,000 grants for gadget developers who want to invest time making their already successful gadget even better, and $100,000 seed investments for new gadget-related businesses. For now, applications are restricted to gadget developers who have more than 250,000 pageviews per week on their gadget.

Our hope with Google Gadget Ventures is to help create an ecosystem where developers can spend more time doing what they love -- building great gadgets. You'll find more details on how to apply on Tom's post on the Google Code Blog and the Google Gadget Ventures web page. I'm extremely excited to see what you all come up with!

New advisory group on health



Every day, people use Google to learn more about an illness, drug, or treatment, or simply to research a condition or diagnosis. We want to help users make more empowered and informed healthcare decisions, and have been steadily developing our ability to make our search results more medically relevant and more helpful to users.

Although we have some talented people here with extensive backgrounds in health policy and technology, this is an especially complex area. We often seek expertise from outside the company, and health is no exception. We have formed an advisory council, made up of healthcare experts from provider organizations, consumer and disease-based groups, physician organizations, research institutions, policy foundations, and other fields. The mission of the Google Health Advisory Council is broadly to help us better understand the problems consumers and providers face every day and to offer feedback on product ideas and development. It's a great privilege for us to work with this esteemed group.

Google Health Advisory Council
(Institutions or affiliations are listed for identification purposes only.)

Chairman
Dean Ornish, M.D., Founder and President, Preventive Medicine Research Institute, Clinical Professor of Medicine, University of California, San Francisco

Douglas Bell, M.D., Ph.D., Research Scientist, RAND Health, RAND Corporation

Delos M. Cosgrove, M.D., Chief Executive Officer, The Cleveland Clinic

Molly Coye, M.D., M.P.H., Chief Executive Officer, HealthTech

Dan Crippen, Former Congressional Budget Office Director & Reagan White House Assistant

Linda M. Dillman, Executive Vice President, Risk Management, Benefits and Sustainability, Wal-Mart

John Halamka M.D., M.S., Chief Information Officer, Beth Israel Deaconess Medical Center & Harvard Medical School and Chairman, Healthcare Information Technology Standards Panel (HITSP)

Bernadine Healy M.D., Former head of the National Institutes of Health (NIH), Health Editor & Columnist, U.S. News & World Report

Bernie Hengesbaugh, Chief Operating Officer, The American Medical Association (AMA)

Douglas E. Henley, M.D., F.A.A.F.P., Executive Vice President, American Academy of Family Physicians (AAFP)

David Kessler, M.D., Former FDA Commissioner, Vice Chancellor-Medical Affairs & Dean, School of Medicine, UCSF

John Lumpkin, M.D., Senior Vice President, Director of Health Care Group, Robert Wood Johnson Foundation

John Rother, Group Executive Officer of Policy & Strategy, AARP

Anna-Lisa Silvestre, Vice President, Online Services, Kaiser Foundation Health Plan, Inc.

Greg Simon, J.D., President, FasterCures

Mark D. Smith, M.D., MBA, President & Chief Executive Officer, The California HealthCare Foundation

Paul Tang, M.D., Internist & Vice President, Chief Medical Information Officer, Palo Alto Medical Foundation (PAMF) & Chairman, Board of Directors, American Medical Informatics Association (AMIA)

Sharon Terry, M.A., President & Chief Executive Officer, Genetic Alliance

John Tooker, M.D., MBA, F.A.C.P., Executive Vice President & Chief Executive Officer, American College of Physicians (ACP)

Doug Ulman, President, Lance Armstrong Foundation

Robert M. Wachter, M.D., Professor of Medicine, University of California-San Francisco (UCSF); Associate Chairman, UCSF Department of Medicine; Chief of the Medical Service, UCSF Medical Center

Matthew Zachary, Cancer Patient Advocate, Founder & Executive Director, The I'm Too Young for This! Cancer Foundation for Young Adults

Update: Added links to two more bios.

9,000 and counting



This month, we passed the 9,000 mark for enterprise buyers of the Google Search Appliance and the Google Mini. That's a great beginning, but we want to reach out even further, which is why we're embarking on a partnership with Ingram Micro, one of the world's largest distributors of technology products. Ingram has extensive reseller relationships that can help us deliver the power of search behind the firewall to businesses of all sizes, more efficiently and at a larger scale than we could on our own.

Both the Google Mini and the Google Search Appliance are available immediately to qualified Ingram Micro solution providers in the U.S., with plans for a phased rollout in other regions through the end of 2007.

Contact us here to obtain a Google Mini or here for a Google Search Appliance.

Put your photos on a map, and Picasa on your phone



If you've ever seen a great picture and wondered where it was taken, wished you could visit that exact spot yourself, or found yourself itching to share a great photo with somebody while you were away from a computer, we've got two new features on Picasa Web Albums to help you out. First, we're excited to let you know about 'Map My Photos', which lets you show exactly where you took your favorite snapshots. When you share an album with friends, they can see your best photos arrayed on a map (or even in Google Earth). It's the perfect way to showcase a memorable road trip or a globe-trotting vacation.

Here's how to get started: when you create a new album, just fill in the optional 'Place Taken' field. You can even drag and drop individual photos directly onto a map, and use built-in Google Maps technology to pinpoint exactly where each was shot. For a quick peek at what the results look like, check out our test gallery.

But wait! There's more. We're also launching the first version of Picasa Web Albums built specifically for mobile devices. You already have a couple of pictures stuffed in your wallet, and maybe even a few wallpapers stored on your phone. But what about all those snapshots you can't carry around? With Picasa Web Albums for mobile devices, your favorite pictures are always with you. So next time you're at a loss for words when describing just how awesome, cute, or beautiful something really was, just grab your phone for visual backup.

Of course, the mobile version of Picasa Web Albums lets you keep track of photo updates from friends and family, too. Just click 'My Favorites' from the main screen to see the latest photo albums that your contacts have posted to Picasa Web Albums -- you can even post a quick comment on their photos, using your phone. Thumbnails and photos are automatically re-sized for your device's screen, so pictures look good and download fast. All you need to get started is a phone with a web browser and a data plan; learn more here.

As you enjoy your summer travels, remember to take plenty of snaps, and share the most beautiful places in the world (and don't forget to use your phone to show off pics from back home!).

More organizing tools



We collaborate using Google Docs & Spreadsheets so often at work that I now have more than 300 online documents. My project teams create shared documents and spreadsheets for everything: taking notes in meetings, planning product launches, analyzing usability studies, and much more. I also share docs with friends at work to plan baseball outings, and my fiancée and I are using a shared spreadsheet to help manage the guest list for our upcoming wedding. In other words, I'm one of many with a desperate need to organize all my online documents. Thankfully, I got the chance to design a new interface for Google Docs & Spreadsheets that includes folders and some convenient ways to quickly manage and access all my documents (and if you're like me, your own collection of online docs and spreadsheets is growing daily).

Now when you sign in, you'll see a new interface that lets you create personal folders for each of your projects and drag your online documents and spreadsheets into them. On the left-hand side, you'll see a list of all the people you're collaborating with; click on any name to see all the files you're working on with them. To read more about this new interface, head over to the Docs & Spreadsheets blog.

The wedding planning continues -- but at least all the docs I need are now easier to find in a folder. Hope your own organizing is easier now too.

Introducing Google Earth Outreach



When Google Earth launched two years ago, it was fun to see that many people around the world used it to fly to their homes, navigate around their neighborhoods, and explore the planet. But when, in September 2005, it was used to rescue stranded victims in the aftermath of Katrina, we realized that Google Earth had the potential to be a significant tool beyond personal exploration. We began to see public-benefit KMLs created for things like environmental protection and global public health. A large number of non-profit groups started contacting us, asking good questions: Can Google Earth help us illustrate our projects in a new and more compelling manner than text and slideshows? Are there methods or tools for importing our existing data into Google Earth? Can you tell us about any other non-profits who’ve been successful at using Google Earth to reach a new audience, raise awareness, gain volunteers, inspire people into action, and create a tangible impact?

We listened carefully and worked on this for more than a year, and now, the answer is “Yes!” Today we're formally launching Google Earth Outreach, a program designed to empower non-profit groups with the resources, tools, and inspiration that they need to leverage the power of Google Earth for their cause. This is where public service groups can find online guides and video tutorials, inspiring case studies and a gallery of high-quality, public-benefit KML. We are offering free Google Earth Pro licenses to qualified non-profit 501(c)(3) organizations. And the Earth Outreach team is also moderating a forum to foster discussion, exchange ideas, and give technical support.
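For readers curious what this kind of geographic data actually looks like, here is a small, purely illustrative Python sketch that writes a single KML placemark of the sort a non-profit might load into Google Earth. The file name, project name, description, and coordinates are invented for the example; this is not one of the Outreach tools themselves.

    # Hypothetical sketch: write a one-placemark KML file that Google Earth
    # can open. All names and coordinates below are made up.
    KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://www.opengis.net/kml/2.2">
      <Placemark>
        <name>{name}</name>
        <description>{description}</description>
        <Point>
          <!-- KML lists longitude before latitude -->
          <coordinates>{lon},{lat},0</coordinates>
        </Point>
      </Placemark>
    </kml>
    """

    def write_placemark(path, name, description, lon, lat):
        with open(path, "w", encoding="utf-8") as f:
            f.write(KML_TEMPLATE.format(name=name, description=description,
                                        lon=lon, lat=lat))

    # Example usage with made-up values:
    write_placemark("reforestation_site.kml",
                    "Reforestation site (example)",
                    "Illustrative placemark for a conservation project.",
                    -122.0841, 37.4220)

Real public-benefit layers are usually much larger collections of placemarks, paths, and overlays, but they are built from the same basic elements.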

We're excited to see the birth of Google Earth Outreach, and it's truly an honor for us to be able to support the critically important work of these groups. As Dr. Jane Goodall said, "Only if we understand can we care. Only if we care will we help. With Google Earth Outreach, more people have the chance to see, to care, and then to act."

Update: Here's some video from our Google Earth Outreach event on Tuesday. Enjoy!

Why we're buying DoubleClick



In April we announced that we're buying DoubleClick, a leading company in the ad serving business. When we made this announcement, we gave some of our reasons. But because online advertising is complicated, I thought I'd step back a bit and offer some more context. If you're an expert, please bear with me, as some of what follows will seem elementary to those already familiar with the online advertising world. If you're not, I hope this gives you a better understanding of how advertisers, publishers, ad serving companies, agencies and other companies such as Google all fit into this exciting new mix.

A little history
In the earliest years, online ads were simple banner ads on websites. Advertisers would purchase banner ads on the sites their customers were likely to visit. A tire company, for example, would place banner ads on sites for automobile enthusiasts.

An innovation followed: Text-based ads targeted at search. Type “drip irrigation” into a search engine and up pop ads, or “sponsored links,” to gardening service and supply companies. This development made online advertising accessible to small advertisers for the first time. According to a May 2007 IAB (Interactive Advertising Bureau) study called the "Internet Advertising Revenue Report," text-based search ads now account for 40 percent of online ads. Google, Yahoo! and MSN are the leaders in managing this category of text-based ads.

The same IAB study notes that display ads account for roughly another 40 percent of online ad sales. Unlike text ads, these may incorporate 3-D graphics, full-motion video, sound and user interactivity. And the remaining 20 percent consists of other categories such as email, classified and lead-generation ads.

Three portals – AOL, Yahoo! and MSN – lead the industry in display ads. Each has more than $1 billion in annual display ad revenue. Content sites such as CNET and ESPN.com are also in the game. Google, however, has been a minor player in display advertising.

Meanwhile, ad serving companies such as DoubleClick, Atlas, and MediaPlex have been helping advertisers get their ads onto these sites and measure how effective the ads are. Since Google has never played in this space, acquiring DoubleClick will enable us to complement our search and content-based advertising capabilities. Its products and technologies will help to improve online advertising for consumers, advertisers and publishers.

By enabling our AdSense network to work with DoubleClick’s delivery mechanisms, for example, we can give advertisers more precise metrics for judging the effectiveness of their campaigns. The combination of the technologies and expertise of Google and DoubleClick will help publishers better monetize their unsold inventory, thus helping to fuel the creation of even more rich and diverse content on the Internet.

What ad serving is
As you might expect, ad serving is the act of serving, or delivering, ads to websites. Google and DoubleClick play different but complementary roles in online advertising. Google primarily sells ads, and DoubleClick delivers (serves) ads. The relationship between Google and DoubleClick is analogous to the relationship between Amazon.com and Federal Express. Amazon.com makes money by selling a book to the consumer. Federal Express makes money by delivering it to the consumer.

For some perspective on the relative size of the ad serving business versus the online ad sales business: some industry estimates put online ad sales, globally, at about $20-30 billion. According to various eMarketer studies (available by subscription), the ad serving business is many times smaller -- probably a twentieth of that, or even less.

How ad serving works
There are two types of ad-serving products: publisher and advertiser-agency. Publishers use ad-serving products to manage how and when the ads they have sold appear on their websites. For example, will the ad appear on the front page of the site, or on a subsequent page? The process of placing the ad on the appropriate page and in the appropriate size is managed by the publisher’s ad server.

In addition to placing ads in the right location at the right time, ad servers report on the performance of those ads. This is an absolutely vital function: real-time performance reporting enables advertisers and agencies to change the content and timing of ads almost on the fly. The value to the advertiser-agency of an ad-serving company such as DoubleClick is having a single place to measure and report on all online campaigns for ads that run on different sites across the web.
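To make those two jobs concrete, here is a deliberately simplified, hypothetical Python sketch -- it is not DoubleClick's DART, just an illustration of a publisher-side ad server that places booked ads into page slots and keeps the counts needed for performance reporting.

    import random
    from collections import Counter

    class TinyAdServer:
        """Toy publisher ad server: place ads and report on them."""

        def __init__(self):
            self.campaigns = {}           # slot name -> list of booked ad ids
            self.impressions = Counter()  # ad id -> times served
            self.clicks = Counter()       # ad id -> times clicked

        def book(self, slot, ad_id):
            # A publisher "books" an ad it has sold into a slot on a page.
            self.campaigns.setdefault(slot, []).append(ad_id)

        def serve(self, slot):
            # Pick one of the ads booked for this slot and log the impression.
            ads = self.campaigns.get(slot)
            if not ads:
                return None
            ad_id = random.choice(ads)
            self.impressions[ad_id] += 1
            return ad_id

        def record_click(self, ad_id):
            self.clicks[ad_id] += 1

        def report(self, ad_id):
            # The reporting half: impressions, clicks, and click-through rate.
            shown = self.impressions[ad_id]
            ctr = self.clicks[ad_id] / shown if shown else 0.0
            return {"impressions": shown, "clicks": self.clicks[ad_id], "ctr": ctr}

Real ad servers layer targeting, scheduling, frequency capping, and much more on top of this, but placement plus measurement is the core of the business described above.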

How Google and DoubleClick differ
Google makes money primarily by selling text-based ads to advertisers and their agencies. These are displayed on Google.com and partner sites through our AdSense program. We get paid when consumers click on the ads.

DoubleClick is in the ad-serving business and has two primary products. DART for Advertisers is an ad server that gives advertisers/agencies the tools to plan, deliver and report on their online ads. DART for Publishers gives publishers the tools to place ads on their site, optimize them, and assess placement to make the best use of their ad inventory. For the most part, DoubleClick is paid by advertisers and publishers to serve and report on ads. These are two vital and interrelated functions. Allowing agencies and advertisers to deliver ads in the right context and monitor their effectiveness maximizes the return on investment for a given ad or campaign. Ultimately, this leads to better and more relevant ads for the consumer.

Why we're buying DoubleClick
In summary, we're buying DoubleClick because:
  1. DoubleClick's products and technology are complementary to our search and content-based text advertising business, and give us new opportunities to improve online advertising for consumers, advertisers and publishers.
  2. Historically, we've not allowed third parties to serve into Google's AdSense network, which has made it hard for advertisers to get performance metrics. Together, Google and DoubleClick can deliver a more open platform for advertisers, and provide the metrics they need to manage marketing campaigns.
  3. By combining Google's infrastructure with DoubleClick's knowledge of agencies and publishers, we can create the next generation of more innovative ad serving technology, one that significantly improves the efficiency and effectiveness of online advertising.
  4. To manage ad inventory, some of the largest publishers use DoubleClick DART for Publishers – but a good portion of that inventory goes unsold. It's our view that the combination of DoubleClick and Google will help these publishers succeed by monetizing their unsold inventory.
We believe DoubleClick can help Google deliver better, more relevant display ads, which improves the online experience of consumers. From a technical perspective, Google will also be able to get web pages to load faster by reducing latency from ad servers. Publishers will benefit by making more money from remnant inventory and – as has been the case with other technologies we've acquired – we hope to make ad serving more accessible. Smaller publishers would get access to DoubleClick's ad serving technology, enabling them to better compete in the global marketplace.

Advertisers and agencies will benefit, too. AdSense will support certain ad tags so advertisers will be able to use a broader selection of formats in our ad network, improving ad relevance. And the experience for advertisers will be more efficient, because there will be an ad server that provides consolidated reporting and management of display ads on all properties and networks. More generally, we'll be able to use our technology and record of innovation to improve the quality of existing products in the marketplace. We intend to invest heavily in R&D and product development to respond to the demand from publishers, advertisers and agencies for better tools.

In short, Google’s acquisition of DoubleClick will benefit all parties in the online advertising business, including advertisers, publishers, agencies and, most importantly, consumers.

New dictionary translations



Google's automatic translation is handy for getting translations of complete sentences, paragraphs, and documents. But when you need to translate a single word, a bilingual dictionary can be very useful because it gives you translations for the many possible meanings a word might have. With that in mind, we've added dictionary translations to Google Translate. Now, for example, if you want to know how to say "play" in Spanish, you can use our dictionary translation and learn that depending on the context it can be "jugar", "tocar", or "obra", among others.

A smooth Apps move



Today, it becomes a lot easier for organizations and schools to start using Google Apps email services without leaving any of their valuable email data behind. Our new self-service mail migration tools enable administrators using the Premier and Education Editions to easily copy existing mail from an IMAP server over to Google Apps. Now businesses and schools can spend less time worrying about maintaining infrastructure and focus more on the things that matter most to them -- like healthcare or educating students.

One of the first organizations to test this out, Central Piedmont Community College, replaced its old email system for 30,000 users in just 3 weeks. And that process came down to 3 million emails flying from their server over to ours in just 24 hours -- more than 2,000 emails per minute, all without missing a beat.
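For the technically curious, the general shape of an IMAP-to-IMAP copy looks roughly like the Python sketch below, which uses the standard imaplib module. The hostnames, credentials, and folder names are placeholders, and this illustrates the idea only -- it is not Google's migration tool, which handles scale, retries, and provisioning for you.

    import imaplib

    # Placeholder servers -- substitute real hostnames and credentials.
    SOURCE_HOST = "imap.old-mail.example.edu"
    DEST_HOST = "imap.new-mail.example.com"

    def copy_inbox(src_user, src_pass, dst_user, dst_pass):
        # Read-only connection to the legacy server.
        src = imaplib.IMAP4_SSL(SOURCE_HOST)
        src.login(src_user, src_pass)
        src.select("INBOX", readonly=True)

        dst = imaplib.IMAP4_SSL(DEST_HOST)
        dst.login(dst_user, dst_pass)

        # Fetch every message in the source INBOX and append the raw
        # RFC 822 content, unchanged, to the destination INBOX.
        typ, data = src.search(None, "ALL")
        for num in data[0].split():
            typ, msg_data = src.fetch(num, "(RFC822)")
            raw_message = msg_data[0][1]
            dst.append("INBOX", None, None, raw_message)

        src.logout()
        dst.logout()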

Once you're part of the Google Apps family, you can be sure there are more exciting things to come. In the past month alone, we added five new improvements to make it even easier for organizations to share information and work together.