
Now there's Google Finance for Canada - update



Oops. We hit the button too soon. Watch for news about Google Finance in Canada next Tuesday.

Robots Exclusion Protocol: now with even more flexibility



This is the third and last in my series of blog posts about the Robots Exclusion Protocol (REP). In the first post, I introduced robots.txt and the robots META tags, giving an overview of when to use them. In the second post, I shared some examples of what you can do with the REP. Today, I'll introduce two new features that we have recently added to the protocol.

As a product manager, I'm always talking to content providers to learn about your needs for REP. We are constantly looking for ways to improve the control you have over how your content is indexed. These new features will give you flexible and convenient ways to improve the detailed control you have with Google.

Tell us if a page is going to expire
Sometimes you know in advance that a page is going to expire in the future. Maybe you have a temporary page that will be removed at the end of the month. Perhaps some pages are available free for a week, but after that you put them into an archive that users pay to access. In these cases, you want the page to show in Google search results until it expires, then have it removed: you don't want users getting frustrated when they find a page in the results but can't access it on your site.

We have introduced a new META tag that allows you to tell us when a page should be removed from the main Google web search results: the aptly named unavailable_after tag. This tag follows a similar syntax to other REP META tags. For example, to specify that an HTML page should be removed from the search results after 3pm Eastern Standard Time on 25th August 2007, simply add the following tag to the <HEAD> section of the page:

<META NAME="GOOGLEBOT" CONTENT="unavailable_after: 25-Aug-2007 15:00:00 EST">

The date and time are specified in the RFC 850 format.
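Getting the timestamp format right by hand is error-prone, so you may prefer to generate the tag programmatically. Here's a small Python sketch (the helper name is ours) that builds the tag in the same RFC 850-style format as the example above:

```python
from datetime import datetime

def unavailable_after_tag(expires, tz_abbr):
    """Build a GOOGLEBOT unavailable_after META tag.

    The timestamp follows the RFC 850-style format shown in the
    example above (e.g. "25-Aug-2007 15:00:00 EST"); tz_abbr is the
    timezone abbreviation supplied by the caller.
    """
    stamp = expires.strftime("%d-%b-%Y %H:%M:%S")
    return ('<META NAME="GOOGLEBOT" '
            'CONTENT="unavailable_after: %s %s">' % (stamp, tz_abbr))

print(unavailable_after_tag(datetime(2007, 8, 25, 15, 0, 0), "EST"))
# → <META NAME="GOOGLEBOT" CONTENT="unavailable_after: 25-Aug-2007 15:00:00 EST">
```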

This information is treated as a removal request: it will take about a day after the removal date passes for the page to disappear from the search results. We currently only support unavailable_after for Google web search results.

After the removal, the page stops showing in Google search results but it is not removed from our system. If you need a page to be excised from our systems completely, including any internal copies we might have, you should use the existing URL removal tool which you can read about on our Webmaster Central blog.

Meta tags everywhere
The REP META tags give you useful control over how each webpage on your site is indexed. But they only work for HTML pages. How can you control access to other types of documents, such as Adobe PDF files, video and audio files and other types? Well, now the same flexibility for specifying per-URL tags is available for all other file types.

We've extended our support for META tags so they can now be associated with any file. Simply add any supported META tag to a new X-Robots-Tag directive in the HTTP Header used to serve the file. Here are some illustrative examples:
  • Don't display a cache link or snippet for this item in the Google search results:
X-Robots-Tag: noarchive, nosnippet
  • Don't include this document in the Google search results:
X-Robots-Tag: noindex
  • Tell us that a document will be unavailable after 7th July 2007, 4:30pm GMT:
X-Robots-Tag: unavailable_after: 7 Jul 2007 16:30:00 GMT

You can combine multiple directives in the same document. For example:
  • Do not show a cached link for this document, and remove it from the index after 23rd July 2007, 3pm PST:
X-Robots-Tag: noarchive
X-Robots-Tag: unavailable_after: 23 Jul 2007 15:00:00 PST
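To make the mechanics concrete, here's a minimal WSGI sketch in Python (the function name and placeholder body are ours, not from the post) showing how the combined directives above might be attached when serving a non-HTML file such as a PDF:

```python
def serve_pdf(environ, start_response):
    """Serve a PDF with X-Robots-Tag directives in the HTTP response
    headers: don't show a cached link, and drop the document from the
    index after the given date."""
    headers = [
        ("Content-Type", "application/pdf"),
        ("X-Robots-Tag", "noarchive"),
        ("X-Robots-Tag", "unavailable_after: 23 Jul 2007 15:00:00 PST"),
    ]
    start_response("200 OK", headers)
    return [b"%PDF-1.4 ..."]  # placeholder document body
```

Because the directives travel in the HTTP headers rather than the document body, the same approach works for any file type the server can deliver.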


Our goal for these features is to provide more flexibility for indexing and inclusion in Google's search results. We hope you enjoy using them.

Computer science resources for academics



Google has a long history of involvement with universities, and we're excited to share some recent news on that front with you. At the main Google campus this week we're hosting the Google Faculty Summit, which brings together faculty from universities all over the world for discussions about what we're up to in research-land, as well as computer science education - something very near and dear to us.

Meanwhile, because we know that between teaching, doing research and advising students, computer science educators are quite strapped for time, we've recently launched a site called Google Code for Educators. While you may have previously heard about our offerings for K-12 teachers, this new program is focused on CS topics at the university level, and lets us share the knowledge we've built up around things like distributed systems and AJAX programming. It's designed for university faculty to learn about new computer science topics and include them in their courses, as well as to help curious students learn on their own.

Right now, Google Code for Educators offers materials for AJAX web programming, distributed systems and parallel programming, and web security. The site includes slides, programming labs, problem sets, background tutorials and videos. We're eager to provide more content areas and also more iterations for existing topic areas. To allow for liberal reuse and remixing, most sample course content on Code EDU is available under a Creative Commons license. Please let us know your thoughts on this new site.

Beyond CS education, another important faculty topic is research. Google Research offers resources to CS researchers, including papers authored by Googlers and a wide variety of our tech talks. You might be interested in learning more about MapReduce and the Google File System, two pieces of Google-grown technology that have allowed us to operate at enormous scale. We also recently put together a few university research programs and we're eager to see what academics come up with.

What Eric Schmidt did this summer



In case you're thinking summer is the time to slow down, that's not always true around here. Our CEO has been on the go on behalf of a number of our public policy initiatives. And our Public Policy blog has been keeping up with him.

Earth to the Enterprise



With more than 200 million downloads and counting, Google Earth is known around the world. Less well-known is our Google Earth Enterprise which companies, organizations and government agencies use to view their global data and imagery. Experts and amateurs alike use it for everything from designing new buildings to exploring for energy to responding to emergencies, because Google Earth Enterprise offers access to geospatial info that was once limited to specialty applications.

For instance, check out Dell's implementation showing a geographic view of traffic to Dell.com:



Today, we're releasing the latest version, which makes it easy to publish and view Google Earth datasets in 2D using a browser. By accessing Google Earth Enterprise from a web browser, employees across an organization will benefit from the rich geographic tapestry. There's more detail on the Google Lat/Long blog.

Like making videos? Love Gmail?



A couple weeks back, some of us on the Gmail team were talking about how simple it's become to connect with people around the world through email. And we got to thinking: what if email was delivered via a Rube Goldberg machine, but instead of gears and levers, people on everything from bicycles to submarines brought messages from one place to another? So we had a little fun with a collaborative video depicting just that (well, not the submarines).

Now it's time to let everyone in on the action. Learn more at our new Gmail Blog, or go directly to http://mail.google.com/mvideo where you'll find directions on how to submit your clip.



Calling all SketchUp fans



It's my great pleasure to announce the launch of the Official Google SketchUp Blog. Fast-breaking news, tips and tricks, user stories and just the right amount of office intrigue await anyone who pays us a visit. Also, sexy mustache contests.

For those of you who have no idea what SketchUp is, I'll start at the beginning: The world is three-dimensional. Designing a house, building a piece of furniture and navigating through a city all involve three-dimensional decisions. SketchUp is 3D modeling software that anyone can use to build models of whatever they like.

Check out the 3D Warehouse to see models from people all over the world, and turn on the 3D Warehouse layer in Google Earth to explore cities with realistic 3D buildings made in SketchUp (Denver is particularly impressive). If you like, you can download the free version and start building models yourself.

Our commitment to open broadband platforms



For several years now, many Googlers have been working to identify the obstacles that prevent the Internet from being available to everyone on the planet. It strikes us as unfair that some people should enjoy such abundant access to this rich resource while billions of others aren't so lucky. Though the technology to provide access on a global scale exists today, we have often learned that technology isn't the problem. In this context, we have worked hard to advance a set of principles that will make Internet access for all a priority.

For instance, we wrote last week on our Public Policy Blog about Google's interest in promoting competition in the broadband market here in the U.S., to help ensure that as many Americans as possible can access the Internet. However, it takes more than just ideas and rhetoric if you want to help bring the Internet to everyone.

So today, we're putting consumers' interests first, and putting our money where our principles are -- to the tune of $4.6 billion. Let me explain.

In the U.S., wireless spectrum for mobile phones and data is controlled by a small group of companies, leaving consumers with very few service providers from which to choose. With that in mind, last week, as the federal government prepares for what is arguably its most significant auction of wireless spectrum in history, we urged the Federal Communications Commission (FCC) to adopt rules to make sure that regardless of who wins the spectrum at auction, consumers' interests are the top priority. Specifically, we encouraged the FCC to require the adoption of four types of "open" platforms as part of the auction:
  • Open applications: consumers should be able to download and utilize any software applications, content, or services they desire;
  • Open devices: consumers should be able to utilize their handheld communications device with whatever wireless network they prefer;
  • Open services: third parties (resellers) should be able to acquire wireless services from a 700 MHz licensee on a wholesale basis, based on reasonably nondiscriminatory commercial terms; and
  • Open networks: third parties (like Internet service providers) should be able to interconnect at any technically feasible point in a 700 MHz licensee's wireless network.
As numerous public interest organizations noted earlier this week, all four of these conditions adopted together would promote a spirit of openness, and could spur additional forms of competition from web-based entities, such as software applications providers, content providers, handset makers, and ISPs. The big winners? Consumers. As choices increase, prices come down and more Americans have access to the Net.

The FCC is currently considering draft rules for the auction, and the reports we've heard are that those rules include some -- but not all four -- of the openness conditions that we and consumer groups support. While any embrace of open platforms is welcome, only if the FCC adopts all four principles will we see the genuinely competitive marketplace that Americans deserve. In particular, guaranteeing open services and open networks would ensure that entrepreneurs starting new networks and services will have a fair shot at success, in turn giving consumers a wider choice of broadband providers.

There are some who have claimed that embracing these principles and putting American consumers first might somehow devalue this spectrum. As much as we don't believe this to be the case, actions speak louder than words. That's why our CEO Eric Schmidt today sent a letter to FCC Chairman Kevin Martin, saying that, should the FCC adopt all four license conditions requested above, Google intends to commit at least $4.6 billion to bidding for spectrum in the upcoming 700 MHz auction.

Why $4.6 billion? While we think that a robust and competitive auction based on these four principles will likely produce much higher bids, and we are eager to see a diverse set of bidders competing, $4.6 billion is the reserve price that the FCC has proposed for the auction. With any concerns about revenue to the U.S. Treasury being satisfied, we hope the FCC can return its attention to adopting openness principles for the benefit of consumers.

In the meantime, thank you to those who have reached out to help with our efforts. It feels good to see how many of you support true competition for the benefit of consumers and we look forward to hearing from even more of you in the days to come.

For now, and for all of us, the issue is simple: this is one of the best opportunities we will have to bring the Internet to all Americans. Let's seize that opportunity.

Note: We've cross-posted this to our Public Policy Blog.

Your Campus in 3D winners announced



The results are in for the winners of the Build Your Campus in 3D Competition, which we announced in January. The judges chose 7 teams from among the dozens who submitted more than 4,000 buildings from colleges and universities all over North America. And the winning school teams who will be joining us in Mountain View are:

University of Minnesota | Twin Cities, Minnesota
Purdue University | West Lafayette, Indiana
Concordia University | Montreal, Quebec
Indiana University - Purdue University Fort Wayne | Fort Wayne, Indiana
Franklin W. Olin College of Engineering | Needham, Massachusetts
Dartmouth College | Hanover, New Hampshire
Stanford University | Stanford, California

Check out the competition site to see more details about the judges, the rules, the winners, and what they won. From there, you can follow a link to see the winning campuses in your copy of Google Earth. Again, congrats to the winning teams, and a big thank you to everyone who participated.

Opening up Google Print Ads



Even with the growth of online news sites, Americans still read newspapers. Over the course of a typical week, nearly 3 out of 4 adults (115 million) in the top 50 markets read a copy of a daily or Sunday newspaper.* That's why thousands of businesses use print advertising every day to reach a local audience, and why we've announced that we're extending Google AdWords to newspapers for most U.S. advertisers. To learn more, visit the Google Print Ads™ site, or read about it on the Inside AdWords blog.

*Scarborough Research USA, Release 2, 2006.

Hosted site search for businesses

Nitin Mangtani, Product Manager, Enterprise Search and Rajat Mukherjee, Group Product Manager, Search

Businesses spend a lot of effort and energy creating and promoting great websites for their products and services, but quality search is often missing. As a result, businesspeople often ask us why they can't use Google to power search on their sites.

Today we've released Custom Search Business Edition (CSBE) to do just that. CSBE is a hosted site search solution that provides Google-quality results for your website. It's fast, relevant, reliable, and flexible, so that users can quickly find what they're looking for through search results customized and integrated into your business website.

CSBE builds on the Google Custom Search Engine, a hosted search solution we introduced last October that allows organizations to create a search engine and search results that are tailored to their point of view. All well and good, but businesses have asked us for greater flexibility and support -- and we're addressing these needs with CSBE. Businesses that want further control over results presentation and integration with their website can obtain results through XML. Now those of you with business sites have the option to turn off ads and have further control over branding. In addition, CSBE provides options for email and phone support. The pricing starts at $100 per year for searching up to 5,000 pages.

This offering should be a great help to the millions of businesses that have a web presence but don't offer users any way to search the site. Instead of being left on their own to navigate content, visitors to CSBE-enabled sites will be able to navigate through search results without ever leaving the site. We hope an improved customer search experience will translate into more referrals, more opportunities for e-commerce, and more satisfied online customers for these businesses. Here's more about CSBE.

Automating web application security testing



Cross-site scripting (aka XSS) is the term used to describe a class of security vulnerabilities in web applications. An attacker can inject malicious scripts to perform unauthorized actions in the context of the victim's web session. Any web application that serves documents that include data from untrusted sources could be vulnerable to XSS if the untrusted data is not appropriately sanitized. A web application that is vulnerable to XSS can be exploited in two major ways:

    Stored XSS - Commonly exploited in a web application where one user enters information that's viewed by another user. An attacker can inject malicious scripts that are stored by the application and executed in the context of the victim's session when the victim visits the site at some point in the future. Improperly sanitized blog comments and guestbook entries are common vehicles for stored XSS.

    Reflected XSS - An application that echoes improperly sanitized user input received as query parameters is vulnerable to reflected XSS. With a vulnerable application, an attacker can craft a malicious URL and send it to the victim via email or any other mode of communication. When the victim visits the tampered link, the page is loaded along with the injected script that is executed in the context of the victim's session.

The general principle behind preventing XSS is the proper sanitization (via, for instance, escaping or filtering) of all untrusted data that is output by a web application. If untrusted data is output within an HTML document, the appropriate sanitization depends on the specific context in which the data is inserted into the HTML document. The context could be in the regular HTML body, tag attributes, URL attributes, URL query string attributes, style attributes, inside JavaScript, HTTP response headers, etc.

The following are some (by no means complete) examples of XSS vulnerabilities. Let's assume there is a web application that accepts user input as the 'q' parameter. In each example, the injected <script> fragment represents untrusted data coming from the attacker.

  • Injection in regular HTML body - angled brackets not filtered or escaped

    <b>Your query '<script>evil_script()</script>' returned xxx results</b>

  • Injection inside tag attributes - double quote not filtered or escaped

    <form ...
      <input name="q" value="blah"><script>evil_script()</script>">
    </form>

  • Injection inside URL attributes - non-http(s) URL

    <img src="javascript:evil_script()">...</img>

  • In JavaScript context - single quote not filtered or escaped

    <script>
      var msg = 'blah'; evil_script(); //';
      // do something with msg variable
    </script>


In the cases where XSS arises from meta characters being inserted from untrusted sources into an HTML document, the issue can be avoided either by filtering/disallowing the meta characters, or by escaping them appropriately for the given HTML context. For example, the HTML meta characters <, >, &, " and ' must be replaced with their corresponding HTML entity references &lt;, &gt;, &amp;, &quot; and &#39; respectively. In a JavaScript-literal context, inserting a backslash in front of \, ' and ", and converting carriage returns, line feeds and tabs into \r, \n and \t respectively, should prevent untrusted meta characters from being interpreted as code.
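As an illustration of those replacement rules, here is a small Python sketch (helper names are ours; production code should rely on a vetted templating library's auto-escaping rather than hand-rolled helpers):

```python
def escape_html(s):
    """Escape HTML meta characters for insertion into an HTML body
    or a quoted attribute. '&' must be replaced first so entity
    references produced by later replacements are not double-escaped."""
    return (s.replace("&", "&amp;")
             .replace("<", "&lt;")
             .replace(">", "&gt;")
             .replace('"', "&quot;")
             .replace("'", "&#39;"))

def escape_js_literal(s):
    """Escape a string for insertion into a JavaScript string literal.
    The backslash must be escaped first for the same reason."""
    return (s.replace("\\", "\\\\")
             .replace("'", "\\'")
             .replace('"', '\\"')
             .replace("\r", "\\r")
             .replace("\n", "\\n")
             .replace("\t", "\\t"))

print(escape_html("<script>evil_script()</script>"))
# → &lt;script&gt;evil_script()&lt;/script&gt;
```

Note the ordering constraint in each helper: escaping the escape character itself ('&' or '\') first is what keeps the output unambiguous.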

How about an automated tool for finding XSS problems in web applications? Our security team has been developing a black box fuzzing tool called Lemon (deriving from the commonly-recognized name for a defective product). Fuzz testing (also referred to as fault-injection testing) is an automated testing approach based on supplying inputs that are designed to trigger and expose flaws in the application. Our vulnerability testing tool enumerates a web application's URLs and corresponding input parameters. It then iteratively supplies fault strings designed to expose XSS and other vulnerabilities to each input, and analyzes the resulting responses for evidence of such vulnerabilities. Although it started out as an experimental tool, it has proved to be quite effective in finding XSS problems. Besides XSS, it finds other security problems such as response splitting attacks, cookie poisoning problems, stacktrace leaks, encoding issues and charset bugs. Since the tool is homegrown it is easy to integrate into our automated test environment and to extend based on specific needs. We are constantly in the process of adding new attack vectors to improve the tool against known security problems.
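In that spirit, here is a toy reflected-XSS probe - emphatically not Lemon, just a sketch of the fuzzing idea, with all names ours. It swaps a marker payload into each query parameter in turn and reports any parameter whose value comes back unescaped; the fetch function is injected so the sketch stays self-contained:

```python
from urllib.parse import urlencode, urlsplit, parse_qsl, urlunsplit

# Marker payload designed to break out of an HTML attribute context
PAYLOAD = '"><script>xss_probe_1337()</script>'

def probe_reflected_xss(url, fetch):
    """Fuzz each query parameter of `url` and return the names of
    parameters whose injected payload is echoed back unescaped.
    `fetch` is any callable mapping a URL to a response body string."""
    parts = urlsplit(url)
    params = parse_qsl(parts.query)
    vulnerable = []
    for i, (name, _) in enumerate(params):
        fuzzed = list(params)
        fuzzed[i] = (name, PAYLOAD)
        test_url = urlunsplit(parts._replace(query=urlencode(fuzzed)))
        body = fetch(test_url)
        if PAYLOAD in body:  # payload survived without escaping/filtering
            vulnerable.append(name)
    return vulnerable
```

A real fuzzer would of course enumerate URLs automatically, try many fault strings per input, and analyze responses for more than a literal echo, but the iterate-inject-inspect loop is the core idea.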

Update:
I wanted to respond to a few questions that seem to be common among readers. I've listed them below. Thanks for the feedback. Please keep the questions and comments coming.

Q. Does Google plan to market it at some point?
A. Lemon is highly customized for Google apps and we have no plans to release it in the near future.

Q. Did Google's security team check out any commercially available fuzzers? Is the ability to keep improving the fuzzer the main draw of a homegrown tool?
A. We did evaluate commercially available fuzzers but felt that our specialized needs could be served best by developing our own tools.

Cookies: expiring sooner to improve privacy



We are committed to an ongoing process to improve our privacy practices, and have recently taken a closer look at the question of cookie privacy. How long should a web site "remember" cookie information in its logs after a user's visit? And when should a cookie expire on your computer? Cookie privacy is both a server and a client issue.

On the server side, we recently announced that we will anonymize our search server logs — including IP addresses and cookie ID numbers — after 18 months.

Now, we're asking the question about cookie lifetime: when should a cookie expire on your computer? For background: a cookie is a very small file which gets stored on your computer. All search engines and most websites use cookies. Why? Cookies remind us of your preferences from the last time you visited our site. For example, Google uses our so-called "PREF cookie" to remember our users' basic preferences, such as the fact that a user wants search results in English, no more than 10 results on a given page, or a SafeSearch setting to filter out explicit sexual content. When we originally designed the PREF cookie, we set the expiration far into the future — in 2038, to be exact — because the primary purpose of the cookie was to preserve preferences, not to let them be forgotten. We were mindful of the fact that users can always go to their browsers to change their cookie management settings, e.g. to delete all cookies, delete specific cookies, or accept certain types of cookies (like first-party cookies) but reject others (like third-party cookies).

After listening to feedback from our users and from privacy advocates, we've concluded that it would be a good thing for privacy to significantly shorten the lifetime of our cookies — as long as we could find a way to do so without artificially forcing users to re-enter their basic preferences at arbitrary points in time. And this is why we’re announcing a new cookie policy.

In the coming months, Google will start issuing our users cookies that will be set to auto-expire after 2 years, while auto-renewing the cookies of active users during this time period. In other words, users who do not return to Google will have their cookies auto-expire after 2 years. Regular Google users will have their cookies auto-renew, so that their preferences are not lost. And, as always, all users will still be able to control their cookies at any time via their browsers.
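A rough sketch of that auto-expire / auto-renew behavior (names and values are ours, not Google's implementation): every response to an active user re-issues the preference cookie with the expiry pushed out another two years, so a user who never returns sees it lapse on schedule while a regular visitor's cookie never expires.

```python
from datetime import datetime, timedelta

COOKIE_LIFETIME = timedelta(days=2 * 365)  # roughly two years

def renewed_cookie_header(name, value, now):
    """Build a Set-Cookie header whose expiry is two years from `now`.
    Emitting this on every visit renews active users' cookies; the
    cookies of users who never return expire on schedule."""
    expires = (now + COOKIE_LIFETIME).strftime("%a, %d-%b-%Y %H:%M:%S GMT")
    return "Set-Cookie: %s=%s; expires=%s" % (name, value, expires)
```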

Together, these steps — logs anonymization and cookie lifetime reduction — are part of our ongoing plan to continue innovating in the area of privacy to protect our users.

Nonprofits mix it up with Google Apps

Search your blog world

Rajat Mukherjee, Group Product Manager

What happens when you put together a popular blogging platform, a customizable search experience and a flexible search API? You get a Search Box widget for Blogger, built using the AJAX Search API, and powered by a Linked Custom Search Engine (CSE).

Configure this widget on your Blogger blog and you can immediately search not just your blog posts, but across all the link lists/blogrolls you've set up on your blog and the links you've made from your posts.

The widget is now available on Blogger in Draft, Blogger's experimental site. Once you've logged in and configured the widget, visitors to your blog will see a search box there. The search experience inherits your blog's look and feel, and is uniquely flavored around pages you've linked to from your blog.

To add the widget:
  1. Edit your blog's layout.
  2. Click on "Add a page element" and configure the "Search Box" widget.
Your link lists will automatically show up as optional tabs for your search; you can decide which ones you want to configure. Go ahead -- custom-search-enable your blog!

Update: Re-posted with copy written for this blog.

Overview of our accessible services



From time to time, our own T.V. Raman shares his tips on how to use Google from his perspective as a technologist who cannot see -- tips that sighted people, among others, may also find useful. - Ed.

We provide a wide variety of services that are mostly accessed with a web browser. People visit Google from a large number of browsers and platforms; in addition, we also understand that every user is special and may have special needs. Accessibility at Google is about making sure that our services work well for all our users -- independent of your needs and abilities at any given time.

Web search, our primary service, has a very simple interface and has always been accessible from a variety of user environments. Newer services that present highly interactive interfaces continue to present accessibility challenges when used with specialized adaptive technologies such as screenreaders. We are committed to finding accessibility solutions that make our services work better for everyone who visits.

Here's a list of our accessibility-related services and a few solutions to some accessibility challenges.

  • Web Search: Result pages include headers to delineate logical sections.
  • Accessible Search: Promotes results that are accessible.
  • Book Search: Full-text access to public-domain works.
  • Gmail: A simple yet functional HTML mode that works well with screenreaders.
  • Gmail Mobile: A lightweight user interface that is also speech-friendly.
  • Google Maps: Easy-to-use textual directions.
  • Calendar: A functional, yet speech-friendly user interface.
  • Audio Captchas: All services that use Google Accounts provide an audio alternative for the visual challenge-response tests that are used to distinguish humans from machines.
  • Mobile Transcoder: A mobile lens for viewing the web that produces accessible views.
  • Google Video: Allows uploaded videos to contain captions/subtitles in multiple languages for viewers who are hearing-impaired or unfamiliar with the original language.
  • Google Talk: IM clients inside a web browser can pose accessibility challenges, but the use of the open Jabber API means that Google users can choose from a variety of Jabber clients, many of which work well with adaptive technologies.
  • Web APIs: Many Google services offer high-level web APIs that aid in authoring mashups; this provides a means for creating highly customized accessible views.
  • 1-800-GOOG-411: Here's an exception to the rule that we deliver most things through a web browser. Our experimental Voice Local Search service lets anyone who can speak into a phone search for a local business by name or category; get connected to the business free of charge; get the details by SMS if you’re using a mobile phone. (Just say "text message".)
Finally, many Google services such as Google Scholar, Google News, Blogger and Google Product Search work out of the box. While today's screenreaders can hit some bumps on the road when using more advanced features in these products, these web interfaces degrade gracefully to provide a functional interface.

If any of this interests you, we invite you to participate in our user community. Please tell us what works well, share your own tips on using Google services, and make sure to tell us what could be made even better.

Update: Added info on 1-800-GOOG-411.


Bloggin' down under



To our international friends, Australia can have a rep for Crocodile Dundee jokes, poisonous animals and meat pies with mushy peas. What you'll find when you visit the new (ish) Google Australia Blog are the real reasons why Google has made a significant investment in Australia. And right now there's a post explaining the special doodle that was on the Google Australia homepage for NAIDOC Week.

Both an engineering and sales support hub, our Aussie team is an eclectic mix of people from all walks of life. We're lucky enough to be the home of the original Google Maps team, so we're at the forefront of global product releases. Aussie Googlers don't take ourselves too seriously, we love a good laugh, we're always happy to make fools of ourselves for a good cause (so long as we're beating the Poms in the Ashes).

Visit our blog to read about new product launches for our Australian users, musings on life as an Aussie Googler and what the team gets up to in the community. (Also, stop by if you want to see words like "centre", "maximise" and "humour" spelt correctly). We hope to see you there!

The reason behind the "We're sorry..." message




Some of you might have seen this message while searching on Google, and wondered what the reason behind it might be. Instead of search results, Google displays the "We're sorry" message when we detect anomalous queries from your network. As a regular user, it is possible to answer a CAPTCHA - a reverse Turing test meant to establish that we are talking to a human user - and to continue searching. However, automated processes such as worms would have a much harder time solving the CAPTCHA. Several things can trigger the sorry message. Often it's due to infected computers or DSL routers that proxy search traffic through your network - this may be at home or even at a workplace where one or more computers might be infected. Overly aggressive SEO ranking tools may trigger this message, too. In other cases, we have seen self-propagating worms that use Google search to identify vulnerable web servers on the Internet and then exploit them. The exploited systems in turn then search Google for more vulnerable web servers, and so on. This can lead to a noticeable increase in search queries, and the sorry message is one of our mechanisms for dealing with it.

At ACM WORM 2006, we published a paper on Search Worms [PDF] that takes a much closer look at this phenomenon. Santy, one of the search worms we analyzed, looks for remote-execution vulnerabilities in the popular phpBB2 web application. In addition to exhibiting worm-like propagation patterns, Santy also installs a botnet client as a payload that connects the compromised web server to an IRC channel. Adversaries can then remotely control the compromised web servers and use them for DDoS attacks, spam or phishing. Over time, adversaries have realized that even though a botnet of web servers provides a lot of aggregate bandwidth, they can gain more leverage by changing the content on the compromised web servers to infect visitors, in turn joining the visitors' computers into much larger botnets. This fundamental shift from remote attacks to client-based downloads of malware formed the basis of the research presented in our first post. In retrospect, it is interesting to see how two seemingly unrelated problems are tightly connected.

Welcome, Postini team



We launched Google Apps so that it would be easier for employees to communicate and share information while reducing the hassles and costs associated with enterprise software. Companies are responding: every day, more than 1,000 small businesses sign up for Google Apps.

Larger enterprises, however, face a challenge: though they want to deliver simple, useful hosted applications to their employees, they're also required to support complex business rules, information security mandates, and an array of legal and corporate compliance issues. In effect, many businesses use legacy systems not because they are the best for their users, but because they are able to support complex business rules. This isn't a tradeoff that any business should have to make.

We realized that we needed a more complete way to address these information security and compliance issues in order to better support the enterprise community. That's why we're excited to share the news that we've agreed to acquire Postini, a company that offers security and corporate compliance solutions for email, IM, and other web-based communications. Like Google Apps, Postini's services are entirely hosted, eliminating the need to install any hardware or software. A leader in its field, Postini serves more than 35,000 businesses and 10 million users, and was one of our first partners for Google Apps. Their email and IM management services include inbound and outbound policy management, spam and virus protection, content filtering, message archiving, encryption, and more. We will continue to support Postini's customers and we look forward to the possibilities ahead.

Here's the press release announcing the deal, and there's more detail in our FAQ and on the Enterprise blog.

Ever more books to read



From time to time, our own T.V. Raman shares his tips on how to use Google from his perspective as a technologist who cannot see -- tips that sighted people, among others, may also find useful.

As a book lover, I couldn't have been more excited by the advent of electronic books in the early 90s. And with the rise of the Internet came an even more exciting possibility: being able to discover books online.

I work on a project at Google called Google Accessible Search, which helps promote results that are more accessible to visually impaired users. Building on that work is today's release of accessible public domain works through Google Book Search. It's opening up hundreds of thousands of books to people who use adaptive technologies such as speech output, screen readers, and Braille displays.

I'm extremely enthusiastic about many of our efforts at Google, which ultimately have the potential to make the experience of accessing information for visually impaired users just as easy as it is for those with sight. My reading used to be determined by what was available in a form I could read. But today it's a question of using Google effectively so that I can find the right thing to read. Today's Book Search launch is an excellent example of how technology is truly democratizing access to the world's information.

If you have feedback about Google's accessibility services, join our group to share tips on what works well and what could be made better.

All aboard



We're pleased to announce that we have acquired GrandCentral Communications, a company that provides services for managing your voice communications. GrandCentral is an innovative service that lets users integrate all of their existing phone numbers and voice mailboxes into one account, which can be accessed from the web. We think GrandCentral's technology fits well into Google's efforts to provide services that enhance the collaborative exchange of information between our users.

GrandCentral offers many features that complement the phone services you already use. If you have multiple phone numbers (e.g., home, work, cell), you get one phone number that you can set to ring all, some, or none of your phones, based on who's calling. This way, your phone number is tied to you, and not your location or job. The service also gives you one central voice mailbox. You can listen to your voicemails online or from any phone, forward them to anybody, add the caller to your address book, block a caller as spam, and a lot more. You can even listen in on voicemail messages from your phone while they are being recorded, or switch a call from your cell phone to your desk phone and back again. All in all, you'll have a lot more control over your phones.
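The "ring all, some, or none of your phones, based on who's calling" behavior is essentially a lookup from caller group to a list of target phones. Here's a hypothetical sketch of what such ring rules might look like - the group names, phone labels, and function are invented for illustration and are not GrandCentral's actual API.

```python
# Hypothetical caller-based ring rules: one public number fans out
# to different phones depending on which group the caller is in.
WORK, HOME, CELL = "work", "home", "cell"

ring_rules = {
    "coworkers": [WORK, CELL],  # ring work and cell phones
    "family":    [HOME, CELL],  # ring home and cell phones
    "blocked":   [],            # ring nothing (treated as spam)
}

def phones_to_ring(caller_group, default=(CELL,)):
    """Return which phones to ring for a caller's group;
    unknown callers fall back to the default (here, cell only)."""
    return list(ring_rules.get(caller_group, default))
```

The point of the design is that the public number is bound to the person, not to any one device: changing who rings where is just editing the rule table, never handing out a new number.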

We're really excited to welcome the GrandCentral team to Google. While we're moving their technology over to Google's network, a limited number of invitations will be available to register for a GrandCentral beta account. If you have a U.S. telephone number, you can sign up for an invitation at www.grandcentral.com. Current GrandCentral customers will continue to have uninterrupted access to the service.

Google and health care



In a world of 24/7 news cycles, a summer weekend can bring considerable -- and unanticipated -- excitement. Take for example the reaction we've just seen to an item on our new health advertising blog. Frankly, we were surprised by the pickup, but perhaps we shouldn't have been. We've been proponents of corporate blogging for some time, despite the significant communication challenges that obviously arise from having many voices from all parts of our company speak publicly through blog posts. In this case, the blog criticized Michael Moore's new film "Sicko" to suggest how health care companies might use our ad programs when they face controversy. Our internal review of the piece before publication failed to recognize that readers would -- understandably, but incorrectly -- interpret the criticisms as reflecting Google's official position. We blew it.

In fact, Google does share many of the concerns that Mr. Moore expresses about the cost and availability of health care in America. Indeed, we think these issues are sufficiently important that we invited our employees to attend his film (nearly 1,000 people did so). We believe that it will fall to many entities -- businesses, government, educational institutions, individuals -- to work together to solve the current system's shortcomings. This is one reason we're deploying our technology and our expertise with the hope of improving health system information for everyone who is or will become a patient. Over the last several months, we have been blogging about our thinking in this area. See: November 30, 2006, March 28, May 23, and June 14, 2007.

In the meantime, we have taken steps on our own to address the failures we see in our health care system. In our case, the menu of health care options that we offer our employees includes both direct services (for example, on-site medical and dental professionals in certain locations) as well as a range of preventive care programs. It's one of the ways we're attempting to demonstrate corporate responsibility on a major issue of our time.