SEI’s solar energy training impacts projects around the world

Lessons learned at Solar Energy International (SEI) can be life-changing for students who take our solar energy training, both online and in person at our campus in Paonia, Colorado. In the case of SEI alum Estevan Figueroa, the impact of SEI classes will spread across the world as he takes what he learned in PV301L: Solar Electric Lab Week (Battery-Based) to Uganda.

Estevan traveled to SEI’s campus in Paonia, Colorado in October for solar energy training in battery-based photovoltaics (PV). Estevan works with a non-profit, Africa Development Promise, which “moves women farmers from food for subsistence to food for business using the cooperative model of enterprise. To promote economic empowerment, [they] have adopted a multi-faceted approach to address the concerns of women farmers. Chief among them is access to sustainable and affordable water and electricity.”

Formerly involved in the oil and gas industry, Estevan accrued years of OSHA (Occupational Safety and Health Administration) training. He got his start in non-profits around eight years ago as an event coordinator. It was the combination of his OSHA training, non-profit experience, and interest in solar that positioned him to join the Africa Development Promise team in its mission to install ‘solar kiosks,’ among other initiatives, in the Wakiso District in Uganda.

In January, Estevan will travel to Uganda for two years to act as a project coordinator and safety inspector for Africa Development Promise. He is currently interning with Utah-based Beautifi Solar, an installation company that is donating equipment and overseeing the initial design of the PV systems that will go on the ‘solar kiosks.’

According to Estevan, the kiosks will be “education stations” for the community. Basic structures have already been built at cooperatives established by Africa Development Promise in the region. At each cooperative there is a greenhouse with a DC (direct-current) water pumping application. The kiosks will add a power station where people can charge their household batteries, kept for emergencies, which otherwise require a costly 12-20 mile trip to charge. The kiosks will also include wifi and a hub for renting tools. A designated person, trained in OSHA standards by Estevan, will rent out the tools and provide guidance. “The next step is teaching about how to maintain the PV systems next year or the year after that,” Estevan said.

Estevan shared one more aspect of the solar kiosks: “One of the other things that I think is really cool about these systems is not only will we have the tools and the training and the means of powering the batteries,” Estevan said, “but I love an ice-cold beverage. Something that’s so minute for us here, but in the villages where we will be working, some of the people have never had the opportunity to have a cold beverage. So we’re going to have a fridge there. When you come to charge your battery or rent some tools, or if you’re just using the internet for a while, you can get one free beverage.”

In line with Africa Development Promise’s mission, the kiosks, as a source of wifi, will also give the village the opportunity to market and sell its agricultural products with the help of the internet.

When Estevan leaves for Uganda at the beginning of January, the structures will already be in place, so he will be able to start installing and commissioning systems immediately. He credits SEI with the solar knowledge he will bring on his journey.

“All of the instructors have great passion, great energy, and great real-world examples of how it’s applicable, and I had a blast,” Estevan said. “This has been amazing; it’s been a lot, a lot of fun.”

To learn more about renewable energy applications for the developing world, check out SEI’s RDOL101: Appropriate Technology for the Developing World.

Google Shares Details About the Technology Behind Googlebot

Posted by goralewicz

Crawling and indexing have been hot topics over the last few years. As soon as Google launched Google Panda, people rushed to their server logs and crawling stats and began fixing their index bloat. All those problems didn’t exist in the “SEO = backlinks” era from a few years ago. With this exponential growth of technical SEO, we need to get more and more technical. That being said, we still don’t know exactly how Google crawls our websites. Many SEOs still can’t tell the difference between crawling and indexing.

The biggest problem, though, is that when we want to troubleshoot indexing problems, the only tools in our arsenal are Google Search Console and its Fetch and Render tool. Once your website includes more than HTML and CSS, there’s a lot of guesswork about how your content will be indexed by Google. This approach is risky, expensive, and can fail multiple times. Even when you discover the pieces of your website that weren’t indexed properly, it’s extremely difficult to get to the bottom of the problem and find the fragments of code responsible for the indexing problems.

Fortunately, this is about to change. Recently, Ilya Grigorik from Google shared one of the most valuable insights into how crawlers work: Google’s Web Rendering Service (WRS) is based on Chrome 41.

Interestingly, this tweet didn’t get nearly as much attention as I would have expected.

So what does Ilya’s revelation in this tweet mean for SEOs?

Knowing that Chrome 41 is the technology behind the WRS is a game-changer. Before this announcement, our only solution was to use Fetch and Render in Google Search Console to see our page as rendered by the WRS. This means we can troubleshoot technical problems that would otherwise have required experimenting and creating staging environments. Now, all you need to do is download and install Chrome 41 to see how your website loads in the browser. That’s it.

You can check the features and capabilities that Chrome 41 supports by visiting Caniuse.com or Chromestatus.com (Googlebot should support similar features). These two websites make a developer’s life much easier.

Even though we don’t know exactly which version Ilya had in mind, we can find the Chrome version used by the WRS by looking at the server logs. It’s Chrome 41.0.2272.118.
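
If you want to confirm this against your own logs, here is a minimal sketch, assuming an Nginx or Apache access log in the common combined format (the path /var/log/nginx/access.log is a placeholder — adjust it for your server):

# List the Chrome versions reported in the user-agent strings of Google's
# fetchers, most frequent first.
grep -i "google" /var/log/nginx/access.log | grep -o "Chrome/[0-9.]*" | sort | uniq -c | sort -rn

Requests coming from the rendering service should report Chrome/41.0.2272.118.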

It will be updated sometime in the future

Chrome 41 was created two years ago (in 2015), so it’s far removed from the current version of the browser. However, as Ilya Grigorik said, an update is coming:

I was lucky enough to get Ilya Grigorik to read this article before it was published, and he provided a ton of valuable feedback on this topic. He mentioned that they are hoping to have the WRS updated by 2018. Fingers crossed!

Google uses Chrome 41 for rendering. What does that mean?

We now have some interesting information about how Google renders websites. But what does that mean, practically, for site developers and their clients? Does this mean we can now ignore server-side rendering and deploy client-rendered, JavaScript-rich websites?

Not so fast. Here is what Ilya Grigorik had to say in response to this question:

We now know the WRS’ capabilities for rendering JavaScript and how to debug them, which lets us troubleshoot and better diagnose problems. However, remember that not all crawlers support JavaScript crawling. As of today, JavaScript crawling is only supported by Google and Ask (and Ask is most likely powered by Google). Even if you don’t care about social media or search engines other than Google, one more thing to remember is that even with Chrome 41, not all JavaScript frameworks can be indexed by Google (read more about JavaScript frameworks crawling and indexing).

Don’t get your hopes up

All that said, there are a few reasons to keep your excitement at bay.

Remember that version 41 of Chrome is over two years old. It may not work very well with modern JavaScript frameworks. To test it yourself, open http://jsseo.expert/polymer/ using Chrome 41, and then open it in any up-to-date browser you are using.

The page in Chrome 41 looks like this:

The content parsed by Polymer is invisible (meaning it wasn’t processed correctly). This is also a perfect example for troubleshooting potential indexing issues. The problem you’re seeing above can be solved if diagnosed properly. Let me quote Ilya:

“If you look at the raised JavaScript error under the hood, the test page is throwing an error due to unsupported (in M41) ES6 syntax. You can test this yourself in M41, or use the debug snippet we provided in the blog post to log the error into the DOM to see it.”

I believe this is another powerful tool for web developers willing to make their JavaScript websites indexable. We will definitely expand our experiment and work with Ilya’s feedback.

The Fetch and Render tool is the Chrome v. 41 preview

There’s another interesting thing about Chrome 41. Google Search Console’s Fetch and Render tool is simply the Chrome 41 preview. The right-hand view (“This is how a visitor to your website would have seen the page”) is generated by the Google Search Console bot, which is… Chrome 41.0.2272.118 (see screenshot below).

There’s evidence that both Googlebot and Google Search Console Bot render pages using Chrome 41. Still, we don’t exactly know what the differences between them are. One noticeable difference is that the Google Search Console bot doesn’t respect the robots.txt file. There may be more, but for the time being, we’re not able to point them out.

Chrome 41 vs Fetch as Google: A word of caution

Chrome 41 is a great tool for debugging Googlebot. However, sometimes (not often) there’s a situation in which Chrome 41 renders a page properly, but the screenshots from Google Fetch and Render suggest that Google can’t handle the page. It could be caused by CSS animations and transitions, Googlebot timeouts, or the usage of features that Googlebot doesn’t support. Let me show you an example.

Chrome 41 preview:

Image blurred for privacy

The above page has quite a lot of content and images, but it looks completely different in Google Search Console.

Google Search Console preview for the same URL:

As you can see, Google Search Console’s preview of this URL is completely different than what you saw on the previous screenshot (Chrome 41). All the content is gone and all we can see is the search bar.

From what we’ve noticed, Google Search Console renders CSS a little bit differently than Chrome 41. This doesn’t happen often, but as with most tools, we need to double-check whenever possible.

This leads us to a question…

What features are supported by Googlebot and WRS?

According to the Rendering on Google Search guide:

  • Googlebot doesn’t support IndexedDB, WebSQL, and WebGL.
  • HTTP cookies and local storage, as well as session storage, are cleared between page loads.
  • All features requiring user permissions (like Notifications API, clipboard, push, device-info) are disabled.
  • Google can’t index 3D and VR content.
  • Googlebot only supports HTTP/1.1 crawling.

The last point is really interesting. Despite statements from Google over the last 2 years, Google still only crawls using HTTP/1.1.

No HTTP/2 support (still)

We’ve mostly been covering how Googlebot uses Chrome, but there’s another recent discovery to keep in mind.

There is still no support for HTTP/2 for Googlebot.

Since it’s now clear that Googlebot doesn’t support HTTP/2, this means that if your website supports HTTP/2, you can’t drop HTTP/1.1 optimization. Googlebot can crawl only using HTTP/1.1.
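
As a quick sanity check, you can confirm from the command line that your site still answers over HTTP/1.1 after enabling HTTP/2. A minimal sketch, assuming curl was built with HTTP/2 support (example.com is a placeholder for your own host):

# Force HTTP/1.1 — the only protocol Googlebot uses — and print the negotiated version:
curl -sI --http1.1 -o /dev/null -w "%{http_version}\n" https://example.com/
# Compare with what a modern browser would negotiate:
curl -sI --http2 -o /dev/null -w "%{http_version}\n" https://example.com/

If the first command prints anything other than 1.1, Googlebot won’t be able to crawl the site.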

There were several announcements recently regarding Google’s HTTP/2 support. To read more about it, check out my HTTP/2 experiment here on the Moz Blog.

Via https://developers.google.com/search/docs/guides/r…

Googlebot’s future

Rumor has it that Chrome 59’s headless mode was created for Googlebot, or at least that it was discussed during the design process. It’s hard to say if any of this chatter is true, but if it is, it means that to some extent, Googlebot will “see” the website in the same way as regular Internet users.

This would definitely make everything simpler for developers who wouldn’t have to worry about Googlebot’s ability to crawl even the most complex websites.

Chrome 41 vs. Googlebot’s crawling efficiency

Chrome 41 is a powerful tool for debugging JavaScript crawling and indexing. However, it’s crucial not to jump on the hype train here and start launching websites that “pass the Chrome 41 test.”

Even if Googlebot can “see” our website, there are many other factors that will affect your site’s crawling efficiency. As an example, we already have proof showing that Googlebot can crawl and index JavaScript and many JavaScript frameworks. It doesn’t mean that JavaScript is great for SEO. I gathered significant evidence showing that JavaScript pages aren’t crawled even half as effectively as HTML-based pages.

In summary

Ilya Grigorik’s tweet sheds more light on how Google crawls pages and, thanks to that, we don’t have to build experiments for every feature we’re testing — we can use Chrome 41 for debugging instead. This simple step will definitely save a lot of websites from indexing problems, like when Hulu.com’s JavaScript SEO backfired.

It’s safe to assume that Chrome 41 will now be a part of every SEO’s toolset.

Does Googlebot Support HTTP/2? Challenging Google’s Indexing Claims – An Experiment

Posted by goralewicz

I was recently challenged with a question from a client, Robert, who runs a small PR firm and needed to optimize a client’s website. His question inspired me to run a small experiment in HTTP protocols. So what was Robert’s question? He asked…

Can Googlebot crawl using HTTP/2 protocols?

You may be asking yourself, why should I care about Robert and his HTTP protocols?

As a refresher, HTTP protocols are the basic set of standards allowing the World Wide Web to exchange information. They are the reason a web browser can display data stored on another server. The first version was initiated back in 1989, which means that, just like everything else, HTTP protocols are getting outdated. HTTP/2 is the latest version of the HTTP protocol, created to replace these aging predecessors.

So, back to our question: why do you, as an SEO, care to know more about HTTP protocols? The short answer is that none of your SEO efforts matter or can even be done without a basic understanding of the HTTP protocol. Robert knew that if his site wasn’t indexing correctly, his client would miss out on valuable web traffic from searches.

The hype around HTTP/2

HTTP/1.1 is a 17-year-old protocol (HTTP 1.0 is 21 years old). Both HTTP 1.0 and 1.1 have limitations, mostly related to performance. When HTTP/1.1 was getting too slow and out of date, Google introduced SPDY in 2009, which was the basis for HTTP/2. Side note: Starting from Chrome 53, Google decided to stop supporting SPDY in favor of HTTP/2.

HTTP/2 was a long-awaited protocol. Its main goal is to improve a website’s performance. It’s currently used by 17% of websites (as of September 2017). Adoption rate is growing rapidly, as only 10% of websites were using HTTP/2 in January 2017. You can see the adoption rate charts here. HTTP/2 is getting more and more popular, and is widely supported by modern browsers (like Chrome or Firefox) and web servers (including Apache, Nginx, and IIS).

Its key advantages are:

  • Multiplexing: The ability to send multiple requests through a single TCP connection.
  • Server push: When a client requires some resource (let’s say, an HTML document), a server can push CSS and JS files to a client cache. It reduces network latency and round-trips.
  • One connection per origin: With HTTP/2, only one connection is needed to load the website.
  • Stream prioritization: Requests (streams) are assigned a priority from 1 to 256 to deliver higher-priority resources faster.
  • Binary framing layer: HTTP/2 is easier to parse (for both the server and the client).
  • Header compression: This feature reduces overhead from plain text in HTTP/1.1 and improves performance.

For more information, I highly recommend reading “Introduction to HTTP/2” by Surma and Ilya Grigorik.

All these benefits suggest pushing for HTTP/2 support as soon as possible. However, my experience with technical SEO has taught me to double-check and experiment with solutions that might affect our SEO efforts.

So the question is: Does Googlebot support HTTP/2?

Google’s promises

HTTP/2 represents a promised land, the technical SEO oasis everyone was searching for. By now, many websites have already added HTTP/2 support, and developers don’t want to optimize for HTTP/1.1 anymore. Before I could answer Robert’s question, I needed to know whether or not Googlebot supported HTTP/2-only crawling.

I was not alone in my query. This is a topic which comes up often on Twitter, Google Hangouts, and other such forums. And like Robert, I had clients pressing me for answers. The experiment needed to happen. Below I’ll lay out exactly how we arrived at our answer, but here’s the spoiler: it doesn’t. Google doesn’t crawl using the HTTP/2 protocol. If your website uses HTTP/2, you need to make sure you continue to optimize the HTTP/1.1 version for crawling purposes.

The question

It all started with a Google Hangouts session in November 2015.

When asked about HTTP/2 support, John Mueller mentioned that HTTP/2-only crawling should be ready by early 2016, and he also mentioned that HTTP/2 would make it easier for Googlebot to crawl pages by bundling requests (images, JS, and CSS could be downloaded with a single bundled request).

“At the moment, Google doesn’t support HTTP/2-only crawling (…) We are working on that, I suspect it will be ready by the end of this year (2015) or early next year (2016) (…) One of the big advantages of HTTP/2 is that you can bundle requests, so if you are looking at a page and it has a bunch of embedded images, CSS, JavaScript files, theoretically you can make one request for all of those files and get everything together. So that would make it a little bit easier to crawl pages while we are rendering them for example.”

Soon after, Twitter user Kai Spriestersbach also asked about HTTP/2 support:

His clients had started dropping optimization of HTTP/1.1 connections, just like most developers deploying HTTP/2, which was at the time supported by all major browsers.

After a few quiet months, Google Webmasters reignited the conversation, tweeting that Google won’t hold you back if you’re setting up for HTTP/2. At this time, however, we still had no definitive word on HTTP/2-only crawling. Just because it won’t hold you back doesn’t mean it can handle it — which is why I decided to test the hypothesis.

The experiment

For months as I was following this online debate, I still received questions from our clients who no longer wanted to spend money on HTTP/1.1 optimization. Thus, I decided to create a very simple (and bold) experiment.

I decided to disable HTTP/1.1 on my own website (https://goralewicz.com) and make it HTTP/2 only. I disabled HTTP/1.1 from March 7th until March 13th.

If you’re going to get bad news, at the very least it should come quickly. I didn’t have to wait long to see if my experiment “took.” Very shortly after disabling HTTP/1.1, I couldn’t fetch and render my website in Google Search Console; I was getting an error every time.

My website is fairly small, but I could clearly see that the crawling stats decreased after disabling HTTP/1.1. Google was no longer visiting my site.

While I could have kept going, I stopped the experiment after my website was partially de-indexed due to “Access Denied” errors.

The results

I didn’t need any more information; the proof was right there. Googlebot wasn’t supporting HTTP/2-only crawling. Should you choose to duplicate this at home with your own site, you’ll be happy to know that my site recovered very quickly.

I finally had Robert’s answer, but felt others may benefit from it as well. A few weeks after finishing my experiment, I decided to ask John about HTTP/2 crawling on Twitter and see what he had to say.

(I love that he responds.)

Knowing the results of my experiment, I have to agree with John: disabling HTTP/1 was a bad idea. However, I was seeing other developers discontinuing optimization for HTTP/1, which is why I wanted to test HTTP/2 on its own.

For those looking to run their own experiment, there are two ways of negotiating an HTTP/2 connection:

1. Over HTTP (unencrypted) – Make an HTTP/1.1 request that includes an Upgrade header. This seems to be the method to which John Mueller was referring. However, it doesn’t apply to my website (because it’s served via HTTPS). What’s more, this is an old-fashioned way of negotiating, not supported by modern browsers. Below is a screenshot from Caniuse.com:

2. Over HTTPS (secure) – Connection is negotiated via the ALPN protocol (HTTP/1.1 is not involved in this process). This method is preferred and widely supported by modern browsers and servers.
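
If you’d like to observe both negotiation styles yourself, here is a minimal sketch, assuming curl built with HTTP/2 support and OpenSSL 1.0.2 or newer (example.com is a placeholder for your own host):

# 1) Upgrade over plain HTTP (h2c): curl sends an "Upgrade: h2c" request header
#    and, if the server agrees, receives "101 Switching Protocols" back.
curl -v --http2 http://example.com/ -o /dev/null
# 2) ALPN over HTTPS: offer h2 during the TLS handshake and check which protocol
#    the server selects ("ALPN protocol: h2" on success).
openssl s_client -connect example.com:443 -alpn h2,http/1.1 </dev/null 2>/dev/null | grep -i "ALPN"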

A recent announcement: The saga continues

Googlebot doesn’t make HTTP/2 requests

Fortunately, Ilya Grigorik, a web performance engineer at Google, let everyone peek behind the curtains at how Googlebot is crawling websites and the technology behind it:

If that wasn’t enough, Googlebot doesn’t support the WebSocket protocol. That means your server can’t send resources to Googlebot before they are requested. Supporting it wouldn’t reduce network latency and round-trips; it would simply slow everything down. Modern browsers offer many ways of loading content, including WebRTC, WebSockets, loading local content from drive, etc. However, Googlebot supports only HTTP/FTP, with or without Transport Layer Security (TLS).

Googlebot supports SPDY

During my research and after John Mueller’s feedback, I decided to consult an HTTP/2 expert. I contacted Peter Nikolow of Mobilio and asked him to see if there was anything we could do to find the final answer regarding Googlebot’s HTTP/2 support. Not only did he provide us with help, Peter even created an experiment for us to use. Its results are pretty straightforward: Googlebot does support the SPDY protocol and Next Protocol Negotiation (NPN). And thus, it can’t support HTTP/2.

Below is Peter’s response:


I performed an experiment that shows Googlebot uses SPDY protocol. Because it supports SPDY + NPN, it cannot support HTTP/2. There are many cons to continued support of SPDY:

  1. This protocol is vulnerable
  2. Google Chrome no longer supports SPDY in favor of HTTP/2
  3. Servers have been neglecting to support SPDY. Let’s examine the NGINX example: from version 1.9.5, they no longer support SPDY.
  4. Apache doesn’t support SPDY out of the box. You need to install mod_spdy, which is provided by Google.

To examine Googlebot and the protocols it uses, I took advantage of s_server, a tool that can debug TLS connections. I used Google Search Console Fetch and Render to send Googlebot to my website.

Here’s a screenshot from this tool showing that Googlebot is using Next Protocol Negotiation (and therefore SPDY):

I’ll briefly explain how you can perform your own test. The first thing you should know is that you can’t use scripting languages (like PHP or Python) for debugging TLS handshakes. The reason for that is simple: these languages see HTTP-level data only. Instead, you should use special tools for debugging TLS handshakes, such as s_server.

Type in the console:


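# Serve files out of the current directory (-WWW) while logging the TLS handshake: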
sudo openssl s_server -key key.pem -cert cert.pem -accept 443 -WWW -tlsextdebug -state -msg
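# Or, instead, return a simple HTML status page to the client (-www):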
sudo openssl s_server -key key.pem -cert cert.pem -accept 443 -www -tlsextdebug -state -msg

Please note the slight (but significant) difference between the “-WWW” and “-www” options in these commands. You can find more about their purpose in the s_server documentation.

Next, invite Googlebot to visit your site by entering the URL in Google Search Console Fetch and Render or in the Google mobile tester.

As I wrote above, there is no logical reason for Googlebot to keep supporting SPDY. This protocol is vulnerable; no modern browser supports it. Additionally, servers (including NGINX) neglect to support it. It’s just a matter of time until Googlebot is able to crawl using HTTP/2. Just implement HTTP/1.1 + HTTP/2 support on your own server (your users will notice the faster loading) and wait until Google is able to send requests using HTTP/2.


Summary

In November 2015, John Mueller said he expected Googlebot to crawl websites by sending HTTP/2 requests starting in early 2016. We don’t know why, as of October 2017, that hasn’t happened yet.

What we do know is that Googlebot doesn’t support HTTP/2. It still crawls by sending HTTP/1.1 requests. Both this experiment and the “Rendering on Google Search” page confirm it. (If you’d like to know more about the technology behind Googlebot, then you should check out what they recently shared.)

For now, it seems we have to accept the status quo. We recommended that Robert (and you, our readers, as well) enable HTTP/2 on your websites for better performance, but continue optimizing for HTTP/1.1. Your visitors will notice and thank you.

Clean Power Plan Repeal: Myth vs. Reality

With EPA administrator Scott Pruitt’s announcement that the Trump Administration was formally proposing repeal of the so-called “Clean Power Plan” (CPP), certain voices in the blogosphere and media predictably went nuts. In the formal response from IER, we have already applauded the announcement as promoting liberty in energy markets and keeping energy more affordable for American households. In the present post, let me further respond to some of the (hysterical) reactions that are based on myths.

Myth #1: “The Obama Administration never started a ‘war on coal.’ This is a bogus GOP talking point.”

Here it would be harder to find a smokier gun than then-presidential candidate Senator Barack Obama, speaking in a public forum to the San Francisco Chronicle back in January 2008. In this clip he says, “…understanding what is at stake, and climate change is a great example. You know when I was asked earlier about the issue of coal. You know, under my plan, of a cap-and-trade system, electricity rates would necessarily skyrocket…”

And then in this clip, in what has become an infamous line, Obama says of his proposed cap-and-trade system, “So if somebody wants to build a coal-powered plant, they can. It’s just that it will bankrupt them because they’re going to be charged a huge sum for all that greenhouse gas that’s being emitted.”

Now to be fair, in the clip I’ve hyperlinked, you can see the full context of that notorious statement, where a few moments earlier Obama says he is open to the idea of coal-fired plants so long as all of the greenhouse gas emissions are captured. Yet given the current technology and cost considerations, to insist on emission-free coal is effectively a ban on new coal-fired plants.

Indeed, Obama’s allies knew that this was the case. Here is economist Paul Krugman, writing in 2013, explaining why direct regulation to prohibit coal is a defensible policy given the political realities:

As I’ve just suggested, the standard economic argument for emissions pricing comes from the observation that there are many margins on which we should operate. … Nonetheless, the message I took from [a book by William Nordhaus] was that direct action to regulate emissions from electricity generation would be a surprisingly good substitute for carbon pricing—not as good, but not bad.

And this conclusion becomes especially interesting given the current legal and political situation in the United States, where nothing like a carbon-pricing scheme has a chance of getting through Congress at least until or unless Democrats regain control of both houses, whereas the Environmental Protection Agency has asserted its right and duty to regulate power plant emissions, and has already introduced rules that will probably prevent the construction of any new coal-fired plants. Taking on the existing plants is going to be much tougher and more controversial, but looks for the moment like a more feasible path than carbon pricing. [Paul Krugman 2013, bold added.]

And there you have it: Paul Krugman was admitting in 2013 that theoretically, an open-ended government “price” on carbon would be preferable, but that in practice Krugman endorsed the EPA’s top-down planning of the energy sector, including the power plant rules that would “probably prevent the construction of any new coal-fired plants.” And, far from contenting himself with stopping the construction of new plants, Krugman went on to hope that the federal government could “tak[e] on the existing plants.”

So when the fans of open markets in the energy sector complain of a “war on coal,” they aren’t imagining things. Leading figures, including Barack Obama and Nobel laureate Paul Krugman, publicly declared their opposition to coal-fired power plants.

Myth #2: “The EPA’s power plant rules wouldn’t have hurt coal. It was natural gas prices that would hurt coal.”

If this were true, then it wouldn’t make any sense for critics to complain about President Trump’s removal of the CPP, would it? Once again, some of the loudest environmental activists try to have it both ways. On the one hand, measures like the CPP are essential to ensuring that our grandchildren survive the ravages of climate change, while on the other hand, these regulations apparently have no impact whatsoever on energy sources or prices for consumers. These environmentalists need to make up their minds.

It is certainly true that falling natural gas prices are part of the reason the US has shifted some of its electricity generation away from coal and into gas-fired plants. Even so, the CPP was projected to have a serious long-term impact on coal generation.

As I explained in this previous IER post, we can use the EIA’s 2017 long-term energy outlook to get a sense of the government’s own forecast of the CPP’s impact. The following chart from the EIA shows two scenarios, with and without the CPP in force:

[EIA chart: projected U.S. electricity generation from coal through 2040, with and without the Clean Power Plan in force]

In the chart above, the right-hand side shows coal’s generation staying roughly level from 2020 through 2040, in the case with no Clean Power Plan.

Yet on the left side, the “Reference case” with the CPP staying in force, we see electricity generation from coal fall significantly from 2020 to 2040, by almost 500 billion kilowatt-hours per year, or about a third.

To be clear, these EIA forecasts include assumptions about coal, natural gas, and renewables pricing and technological breakthroughs. Even so, the CPP in these forecasts made the difference between coal-generated electricity output holding steady versus falling by a third.

We at IER are not in the business of picking winners and losers in the energy sector. If coal loses market share because of developments in hydraulic fracturing and horizontal drilling, then consumers will benefit from more affordable energy.[1]

However, if coal is hampered by government regulations and/or taxes, then this makes energy more expensive. It does not represent innovation or a boon to consumers.

Myth #3: “The Clean Power Plan was essential to the battle against climate change.”

Earlier I pointed out that many critics of the Trump Administration were being inconsistent: On the one hand, they pooh-poohed the warnings that the CPP was hurting the coal sector. On the other hand, they went ballistic saying that the CPP was essential to stop climate change. Those positions can’t both be true.

However, the mirror-image of these claims can be true. Specifically, even though it’s true that the CPP would reduce US coal-fired power generation significantly, it does not follow that the CPP would significantly impact climate change.

For example, climate scientists Pat Michaels and Chip Knappenberger used a standard computer model to estimate that the Clean Power Plan, if it remained in force, would at most have made the global temperature in the year 2100 a mere 0.019 degrees Celsius lower than it otherwise would have been.

Conclusion

Some of President Trump’s most vociferous critics vacillate between mocking him for doing nothing, and freaking out because he’s doing what he said he’d do. No one can deny that Trump campaigned on ending the “war on coal.” The administration’s action on the so-called Clean Power Plan is consistent with that pledge. Contrary to the myths being floated by pundits and bloggers, there really was a war on coal, the fortunes of coal weren’t just due to natural gas prices, and ending the CPP won’t make a big difference to measured climate change.


[1] Some critics allege that “fracking” involves violations of property rights of local landowners. If this were the case, then the practice would not be beneficial to all consumers. Our statements in the text refer to a situation where all market transactions are voluntary.
