Is Copy-Pasting JS and CSS into Theme Files Better for SEO than Linking to External Files?



Nallawalla
Asking in a fresh thread:
Martinez wrote: //The page speed score is most affected by the number of resources the page uses. Each resource requires an additional fetch request between client and server. Hence, you can …
Remove embedded images and videos;
Embed all Javascript in the page itself;
Embed all CSS in the page itself;
Use default fonts.
Most people are unwilling to do these things, but they have the biggest impact.//
This makes me wonder about which is the lesser of the two evils:
1. Place common, reusable code in separate JS and CSS files to avoid reloading for each page (old philosophy of page speed)
2. Copy JS and CSS to each page to reduce fetch requests (new way to speed up).
I am guessing that the new thinking suits pages intended as landing + conversion pages, i.e. organic search lands on a page and converts, rather than, say, a government site where a visitor might sit and browse many pages.
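To make the two options concrete, here is a rough sketch of what each looks like in a page's head (file names and contents here are placeholders, purely for illustration):

```html
<!-- Option 1: shared external files; extra fetches, but cacheable across pages -->
<link rel="stylesheet" href="/assets/site.css">
<script src="/assets/site.js" defer></script>

<!-- Option 2: everything embedded in the page; no extra fetches, nothing cached -->
<style>
  body { margin: 0; font-family: sans-serif; }
</style>
<script>
  document.addEventListener('DOMContentLoaded', function () {
    /* the same behaviour that would otherwise live in site.js */
  });
</script>
```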
2 πŸ‘πŸ½27 πŸ’¬πŸ—¨


Adam J. Humphreys 👑
We call these page calls. Less is definitely more. However, on HTTP/2 you definitely don't want everything combined into files over a certain size.
Michael Martinez 👑
The old way of thinking originated in a world of dial-up modems, when the longer it took to get the basic page to load, the more likely people would abandon the site.
Even slow mobile connections tend to be much faster than dial-up systems. Now, there are still some dial-up systems in use. But for the most part people expect to enjoy superfast speeds compared to the old modem days.
So overall the best design philosophy is to pack as much data into a single fetch as possible, rather than make the browser wait for fetch after fetch to execute. That not only slows down the client, it slows down a busy server.
Embedding Javascript and CSS code in the page will always win on today's Internet.
I'm not sure what the problem with HTTP/2 would be, but it multiplexes the fetches over a single connection, so most people won't have a problem with multiple fetches.

Ammon Johns 👑 » Michael Martinez
Embedded/inline CSS and JS can't be cached. External files are. So while what you say above is great for a single view and bounce session, it is terrible for multi-page visits, both for the user, and for the server.
Michael Martinez 👑 » Ammon Johns
Not really. Downloading 1 megabyte in a single fetch takes much less time than downloading, say, 500,000 bytes and then making two or more additional connections to pull the other 500,000 bytes in separate files.
And Google's page speed tools only use simulators estimating the worst-case scenario. So anyone who actually judges page speed by those tools should take that into consideration.
Marketers do a lot of things in the name of "speeding up sites" that are actually discourteous to their mobile users, such as pre-loading content from sibling pages (adding more fetches to the user experience, and thus using more cellular bandwidth).
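For reference, pre-loading content from sibling pages is typically done with resource hints along these lines (the URLs are placeholders); each hint is an extra background fetch whether or not the visitor ever clicks through:

```html
<!-- Each of these triggers a speculative fetch that costs cellular bandwidth -->
<link rel="prefetch" href="/pricing/">
<link rel="prefetch" href="/contact/">
```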
Overall, the fewer fetches the better because most people are not running on HTTP/2 connections and each connection costs a lot in processing time and bandwidth.


Adam J. Humphreys 👑
You can ask me for a list and I will compile it for you if you'd like. I just built a site that over the last 30 days averaged a 0.5-second desktop load time and 1 second on mobile: 99/100 mobile, 100/100 desktop, without using Nitropack. I should be able to go a little faster with SVG, but for security reasons the industry is moving away from it.
Richard Hearne 👑
I rarely disagree with Michael Martinez, but in this case I will. I work a lot with larger sites on this problem, and while the number of third-party resources does often correlate with poor performance, I've found that JS execution is the biggest enemy. Again, this also correlates with third-party resources, but it can also be caused by first-party resources. I'll go a step further and say that JavaScript in general is the leading cause of poor client-side performance. It's very rare for a JS-free page to have poor pagespeed scores.

Michael Martinez 👑 » Richard Hearne
Well, I agree that a lot of Javascript does things that are just virtual bells and whistles for the developer.
But as someone who has written communications software, right down to the handshaking protocols, I'm gonna stand by what I wrote above. The fewer fetch requests, the better. And those requests represent most of what Google is scoring pages badly for.
You can easily speed up the assessed speed of an image-laden page by removing the images or combining them into a sprite. That does nothing for the processing time required to render those images, but the page speed tools acknowledge the huge time savings in reduced number of fetches.
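A minimal sprite sketch (the file name, class names, and offsets are invented): many small icons ship in one image, so one fetch replaces many, and CSS selects each icon by its offset:

```html
<style>
  .icon {
    display: inline-block;
    width: 32px;
    height: 32px;
    background-image: url(/img/sprite.png); /* one file containing all the icons */
  }
  .icon-search { background-position: 0 0; }
  .icon-cart   { background-position: -32px 0; }
</style>
<span class="icon icon-search"></span>
<span class="icon icon-cart"></span>
```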


Ammon Johns 👑
Accessibility says option 1 – because not ALL clients will download the CSS or JS files at all, and for those that is by FAR the best option.

Ammon Johns 👑
Additionally, on a second page view (i.e. any visit that is not a one page bounce), the CSS and JS will be cached using method 1, so the page is far faster. If embedded as per option 2, this is lost, and a major impact on overall user experience.
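The caching described here depends on the response headers the server attaches to the external files; a typical (illustrative) setup looks like this, so that the second page view reuses the local copies instead of re-fetching them:

```html
<!-- Method 1: external, cacheable files -->
<link rel="stylesheet" href="/assets/site.css">
<script src="/assets/site.js" defer></script>
<!-- Served with headers such as:
       Cache-Control: public, max-age=31536000
     so repeat page views read them from the browser cache.
     With method 2, the same bytes are re-sent inside every HTML page. -->
```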
Michael Martinez 👑 » Ammon Johns
Again, not really that big an impact. It doesn't take that long to download 1-2 megs of data even on a slow cellular connection.
The user experience is most improved with caching IF the user stays on the site and visits more pages – always a lofty goal for any marketer.
But the processing cost to both client and server of handling multiple fetch requests is higher than pulling fewer moderately-sized files across the wireless medium.
Many of Google's recommendations are ridiculous, too. You're not improving the user experience by compressing a 25K image down to 17K. And when sites use Google Fonts, there isn't anything they can do about all the extra DNS lookups, as opposed to only using native browser fonts. But Google recommends improving loading times for those fonts anyway.
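On the font point, the contrast is roughly this (the font family below is just an example): a hosted web font adds DNS lookups and fetches to extra hosts, while a native font stack adds no requests at all:

```html
<!-- Hosted web font: extra DNS lookups and fetches (fonts.googleapis.com, fonts.gstatic.com) -->
<link rel="stylesheet" href="https://fonts.googleapis.com/css2?family=Roboto&display=swap">

<!-- Native browser fonts: zero extra requests -->
<style>
  body { font-family: system-ui, -apple-system, "Segoe UI", Arial, sans-serif; }
</style>
```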
Ammon Johns 👑 » Michael Martinez
Google are not, and never have been, an accessibility body. Hell, they don't give a rat's ass for standards most of the time. Not until they can find a way to leverage it for their own gain and use it to gather more data, or lock rivals out of data. [Not Provided, for example].
So, respectfully, what Google do or do not say about it really doesn't interest me other than how they score it for ranking purposes. You know, that stuff you are arguing against and calling ridiculous like CWV and PageSpeed.
On the accessibility/usability side, this isn't even a debate. External files all the way, so user agents that can't even use scripts or CSS don't waste bandwidth on the completely pointless.
Makes a big old difference to servers too. Browser level caching is a huge deal for server loading, and when one needs to expand the server farm.
Browser-level caching has always been one of the first layers of reducing wasted bandwidth and improving user experience.
Now, while it is wonderful to learn you've never had to use 4G or worse, and never ever had less than full reception on your smartphone, I promise you that puts you in a really freaking rare category.
For the rest of us, caching JS and CSS can make a difference big enough to notice, and notice significantly, in many, many real-world cases. Especially on mobile. Or on shared bandwidth. Or the less than stellar internet access that large swathes of the global population must deal with.
Like I said, in accessibility and usability terms there is not even a debate. It is option 1 all the way: just reduce the size and number of calls for those external files, use compression, and so on.
Michael Martinez 👑 » Ammon Johns
Again, browser-level caching only kicks in ON the 2nd page request and later. On the 1st request there is no browser caching.
And Google is only judging "page speed" on the assumed basis of a 1st-page load. They never take caching into consideration.
As far as user agents that ignore markup go, it's not hurting anything for them to download the extra inline markup. The extra time used to transmit a few thousand (or, at worst, a few tens of thousands) of bytes of text is negligible over any connection of 3 megabits per second on up. 3G speeds are rated at 7.2 Mbps – and 3G has been turned off by most networks.
3G HSPA+ is rated at 42 Mbps. That's more than enough speed to handle inline markup that will be ignored.
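As a rough back-of-the-envelope check of that claim: 30 KB of inline markup is about 240 kilobits, and at 3 Mbps that is roughly 240 ÷ 3,000 ≈ 0.08 seconds of extra transfer time, ignoring latency and protocol overhead.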
Anyone using a CMS like WordPress won't have many options for choosing between all this stuff anyway. We're arguing about esoteric magic spells here.
As far as Google's page speed measurements are concerned, eliminating fetches is the easiest way to measure up to their insane expectations. It also has the effect of improving real-world user experience much better than the majority of Google's recommendations.
Basically, if you get rid of the images, use native fonts, and keep the CSS minimal (and I'm not talking about "minification"), then everyone is happy.
That doesn't happen often in the real world, but it's a business decision, not a constraint imposed by any of the technologies.
Ammon Johns 👑 » Michael Martinez
Yeah, I know how caching works. So I also know how using the most popular framework on the most popular CDN works – there's a fairer chance than you'd think someone already has it cached.
Enough so that I ran an experiment on a couple of my test sites, using the most popular CDN for the most common configuration of Bootstrap rather than stripping Bootstrap down to only what I use on the specific site (always my choice before). The results were good.
With Bootstrap still being a common base for so many WP themes at the time, it really did seem that a notable proportion of visitors had it cached before the *first* page view. Not a majority, but enough to make it worth considering whether the topics you draw a lot of visits for tend to come in the middle of a browsing session, or whether your demographic rarely flushes their cache.
I had considered more use of inline data for images, but with the need to really serve one optimized for the size they'll see it at, the code to manage that was just too much work. Worth it for smaller images of a fixed size, but sadly not so good for larger, main images where serving the right size, rather than rescaling, just seemed safer.
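Concretely (the version number, paths, and sizes below are placeholders): the CDN approach is simply linking the stock build, while the image trade-off is between inlining tiny fixed-size graphics as data URIs and using srcset for larger images so the browser fetches an appropriately sized file:

```html
<!-- Stock Bootstrap from a public CDN, in case a visitor already has it cached -->
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.3/dist/css/bootstrap.min.css">

<!-- Tiny fixed-size image inlined as a data URI: no extra fetch -->
<img alt="" width="16" height="16"
     src="data:image/svg+xml,%3Csvg xmlns='http://www.w3.org/2000/svg' width='16' height='16'%3E%3C/svg%3E">

<!-- Larger hero image: serve the right size rather than inlining or rescaling -->
<img src="/img/hero-800.jpg"
     srcset="/img/hero-800.jpg 800w, /img/hero-1600.jpg 1600w"
     sizes="100vw"
     alt="Hero image">
```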
Michael Martinez 👑 » Ammon Johns
I'll concede 2 points here, and hopefully that will satisfy everyone.
1st point: There are many ways to approach the problem and many effective solutions.
2nd point: I've probably said all I need say about reducing fetches because at this point I think I'll just end up repeating myself.
I hope this discussion has proven in some way useful to people seeking new ideas, new understanding, regardless of what directions they take with their site speeds.


Truslow
The right balance, IMO, comes from making as few external requests as possible while still leveraging the cache. I try not to put all the CSS on the page, but I do spend some time consolidating the CSS from dozens of plugins that would otherwise result in dozens of CSS calls. If I can make that ONE call, get it all – and then ALSO leverage the cache on the next load – I win twice.
It doesn't have to be an all-or-nothing / one-or-the-other solution.
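A sketch of that middle ground (the plugin and theme paths are invented): instead of one stylesheet request per plugin, the theme serves a single consolidated, cacheable file:

```html
<!-- Before: one request per plugin stylesheet -->
<link rel="stylesheet" href="/wp-content/plugins/slider/slider.css">
<link rel="stylesheet" href="/wp-content/plugins/forms/forms.css">
<link rel="stylesheet" href="/wp-content/plugins/gallery/gallery.css">

<!-- After: one consolidated, cacheable stylesheet -->
<link rel="stylesheet" href="/wp-content/themes/my-theme/combined.min.css">
```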
Dixon
The lesser evil is >> 1. Place common, reusable code in a separate file. << This opinion isn't mine; it's from Martin Splitt at a Google conference in Zurich… I am just passing the message on(ish).

Michael Martinez 👑 » Dixon
1 CSS and 1 JS file isn't much of a sacrifice to the user experience (in terms of adding extra fetches).
Most of the problems Google's tools complain about are more esoteric than that. With all the images people throw onto their pages, they're not improving the user experience much at all by following Google's advice. Compressing image files that are under 50K in size doesn't do anything significant for loading/rendering on an HTTP/1.1 connection because every fetch must be processed separately.
The real problem here is that Google's tools are designed around a fantasy of "worst possible download speed + worst possible client CPU" combined with unvalidated automated advice.
People have wasted way too many cycles on these page speed issues for obtaining a relatively minor ranking boost that in most cases doesn't do anything for real-world user experience. I've sat in many a parking lot and tested pages on a mobile phone that Google says are "fast" – and they still take up to 10 seconds to display anything because the browser sits there and downloads a lot of stuff (separately).
So, with all due respect to Martin's knowledge of Google's systems, his advice appears to lack real-world testing experience.
πŸ’ŸπŸ‘πŸ½2
Adam J. Humphreys 👑
HTTP/2 is better with more files, as long as each is under a certain size. Exactly what that size is comes down to testing.




