j.kaspar Posted June 23, 2017
Hi, I am trying to solve a well-known problem (from my research): the render-blocking ..._all.css (the complete CSS, combined into one file) that Google's PageSpeed Insights keeps complaining about. I tried to make it asynchronous, but the result makes no sense. Objects, labels and pictures jump here and there while the page is loading. PageSpeed Insights stops complaining then, but there is no real benefit, and if there is any effect, it is negative. The page is still not usable until the CSS is pretty much completely loaded. Turning the CSS smart cache off and changing header.tpl to load the crucial CSS files without async and the rest with it looks a bit difficult to me, plus I am not sure the result will match expectations... Did anyone manage to solve this?
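For context, the usual async-CSS trick behind this problem can be sketched as below. The helper function and the stylesheet path are hypothetical, not PrestaShop code; it only illustrates the `media="print"` swap pattern commonly used to make a stylesheet non-blocking.

```javascript
// Hypothetical helper emitting an async <link> tag using the media="print"
// swap trick: the browser fetches the stylesheet without blocking first
// render, then the onload handler flips it to media="all". The jumping
// content described above is the unstyled window between those two moments.
function asyncCssTag(href) {
  return `<link rel="stylesheet" href="${href}" media="print" onload="this.media='all'">`;
}

console.log(asyncCssTag('/themes/default/css/global.css'));
```

This is why making the whole combined stylesheet async trades a PageSpeed warning for a visible flash of unstyled content: nothing above the fold is styled until the swap fires.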
j.kaspar (Author) Posted June 26, 2017
So there is no solution?
Scully Posted June 26, 2017
Not a solution that I am aware of. But I can confirm that asynchronous load is not an option. Too many trade-offs.
El Patron Posted June 26, 2017
Hi, great topic. In 1.6 there is no support for async/defer attributes. I have an old module (sort of famous) I wrote for 1.5, JavaPro; it would have made a lot of money, except 1.6 came out with .js loaded at the bottom, lol. A little-known feature of JavaPro was the ability to set async/defer per .js script. Anyway, I mention this only to give background on my efforts in the .js ecosystem and in speeding up the 'above the fold' page render. So with 1.6, PS supported .js at the bottom, optionally CCC'ed. CCC is a great idea but poorly implemented in 1.6 and earlier versions, in that each 'major' page has a different CCC file. CCC'ed files were only helpful if the same major page was visited again during the shopping experience. I wrote an experimental module, jssupercache, that would learn the js used by a shop and consolidate it into a single CCC file. The performance results were incredible: over time there was one CCC .js for all pages, but only after a learning period, i.e. it could not be used in a production environment. It did prove, though, that one CCC file vs. 'many' CCC files was 'significantly' faster, and that the one CCC file was only 10% larger, so preload was not an issue. In PrestaShop 1.7 they introduced the asset manager. This allows the developer to specify the characteristics of their asset: priority, defer, async. 1.7 also implements one CCC of .js/.css (this is a very big deal). After the first visit, the browser cache of 1.7's CCC'ed .js/.css creates, IMHO, the fastest 'above the fold' page render of 'any' CMS on the planet. Also, developers can specify that the asset they register is async/defer, defer being the nirvana, since async gives no guaranteed execution order. My advice from experience is to move to 1.7 when applicable to your business, so you have a 'best in class' framework for asset loads. Happy day, el
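The async/defer distinction above can be sketched as follows. The helper and the script path are hypothetical (this is not the PrestaShop asset manager API); it just shows the two standard HTML attributes the asset manager ultimately controls on the emitted tags.

```javascript
// Sketch of the two loading modes discussed above:
//   async - fetch in parallel, execute as soon as downloaded (order not kept)
//   defer - fetch in parallel, execute in document order after parsing ends
// Helper name and path are illustrative only.
function scriptTag(src, mode) {
  if (mode !== 'async' && mode !== 'defer') {
    throw new Error("mode must be 'async' or 'defer'");
  }
  return `<script src="${src}" ${mode}></script>`;
}

console.log(scriptTag('/js/theme.js', 'defer'));
```

Defer is usually the safer default for theme scripts precisely because execution order is preserved, which async does not guarantee.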
Dh42 Posted June 26, 2017
The render blocking / defer parsing of page speed will be phased out soon to allow for more meaningful metrics. I would not worry about it too much.
El Patron Posted June 26, 2017
The render blocking / defer parsing of page speed will be phased out soon to allow for more meaningful metrics. I would not worry about it too much.
Can you explain what you mean? Applications decide their priority and whether their assets are candidates for top/async/defer as part of their dependency management. It is not possible for the CMS to manage this unless it forces a particular methodology, like .js at the bottom was handled in 1.6. As it concerns CCC, pruning out assets that are async/defer is desirable, but one does not need to run CCC once applications really use the dependency manager and well-formed assets.
Dh42 Posted June 26, 2017
Sure, I can take a stab at it. It's really something that could be blog-post sized, but I think I can squeeze it into a forum post. What does render blocking actually mean? At what point is something blocking render? The way Google PageSpeed currently measures render-blocking scripts is by the size of the script and whether it comes before content. Sounds reasonable, but it is a really bad practice. What they do currently is pick an arbitrary number, around 100kb actually. If your stylesheet or script is over that, then it is considered render blocking. The reason this is a bad practice: if you have a 100kb CSS file being loaded, the loading is almost instantaneous, but the problem comes with repaints. CSS renders from top to bottom, so when you have something at the bottom of the file affecting the top of the page, you get repaint lag. On some sites it is visible as elements jumping around. Any developer can make a 25kb CSS file with overrides all through it that will slow the loading of a page, yet Google PageSpeed will give you a thumbs up, even though it takes 5 seconds to render the site. The same can happen with poorly written JavaScript: it can be small, but so poorly written that it hangs the page up. This is why measuring file size is bad. A more meaningful metric is the first meaningful paint of the page, which is the first paint where all elements of the page that are not controlled by JavaScript are presented in their 100% CSS-rendered state. This is timed after all of the CSS is loaded and rendered. What makes this more meaningful is that this is the time it takes a user from clicking a link to seeing a page they can understand. To further expound on that, there is also what is called first interactive. This is the point, from clicking a link, at which a person can interact with your site.
I know at some point in our lives we have all visited a webpage that looked loaded, but you could not scroll or interact with it yet. Both of these metrics are impossible to derive just by measuring file size, since file size does not account for any of the processing of the files: how complex they are and how long it actually takes them to render output. Google sees this shortcoming, actually, and they are pushing ahead with a new, better way: the Lighthouse project. https://developers.google.com/web/tools/lighthouse/ This is something they believe in so much that it was announced it would be built into Chrome to help web developers out: https://developers.google.com/web/updates/2017/05/devtools-release-notes If you install the plugin, you can see that the metrics are more meaningful when you run it against your site. You can see how long your webpage actually takes from click to fully rendered, something PageSpeed could never give you. You can see from click to fully interactive as well. It even tells you the interactive latency, which is really important if you are running intensive scripts in the background. Let's look at a page, for example. This is the official PrestaShop 1.7 demo (I ran the test twice and picked the better of the two runs): https://www.screencast.com/t/FoR3zbFmIIr0 67% is not that great. The first meaningful paint happens at 4.7 seconds. That is over four seconds of white screen and waiting. Then, after the site renders, you are at 5.7 seconds before you can actually use it. Here is a project I am working on: https://www.screencast.com/t/2ppp9eQeKgS This site has a 94%; it only takes 1.5 seconds before the first render. Then at 3.1 seconds it is ready to use. Which site would you prefer to be yours? These are the more meaningful metrics I am talking about; these are the metrics that Google is starting to push, not the pagespeed metric.
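The two numbers quoted above can be pulled out of a Lighthouse JSON report (e.g. from `lighthouse https://example.com --output=json`). The audit key names below are an assumption based on the report format of that era; treat the helper as a sketch, not an official API.

```javascript
// Hypothetical: extract the two metrics discussed above from a parsed
// Lighthouse JSON report object. Audit key names are assumptions.
function paintMetrics(report) {
  const audits = (report && report.audits) || {};
  const value = (key) => (audits[key] ? audits[key].rawValue : undefined);
  return {
    firstMeaningfulPaintMs: value('first-meaningful-paint'),
    firstInteractiveMs: value('first-interactive'),
  };
}

// Stubbed report shaped like the 1.7 demo numbers quoted above:
const stub = {
  audits: {
    'first-meaningful-paint': { rawValue: 4700 },
    'first-interactive': { rawValue: 5700 },
  },
};
console.log(paintMetrics(stub));
```

The point stands regardless of exact key names: these are timings of what a visitor experiences, not byte counts of what the server ships.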
El Patron Posted June 26, 2017
Thanks for the feedback. For now we work in a barbaric era, but PS has implemented the current best practice for the existing architecture in the open-source arena. It's very good to know what 'might' be coming in the future, but for the here and now I am convinced that PS did pick the best method currently available. Because of my interest in 'above the fold' and my experience of what is required to remove 'all' blocking, PS picked the right horse for today. If/when other techniques become accepted in the future, I'm confident PS will continue to set the standard for a CMS by including them as future features. Now it's all about developing in a non-PS framework that PS supports. I respectfully disagree about what Google measures and weighs as the most significant performance signal. Google measures 'above the fold' and, to a much lesser degree, how long it took to load the footer. https://www.google.com/patents/US20070118640 As for running/viewing any 'performance metrics', that has so many variables, specifically in the MySQL area, that I cannot address results from tests that are not measuring the actual above-the-fold render as seen by the visitor, which is what Google is measuring and what enhances the visitor experience. Moving forward with open-source development, module bundlers like webpack and Symfony compatibility, while in their infancy, are a good start given no other choices out there, and again, the here and now must be addressed.
Dh42 Posted June 26, 2017
Honestly, 'above the fold' means nothing. It's a five-year-old term that was used for a short period of time for desktops, but desktop is not the main market anymore. There was a time when pages rendered downward, a time before JavaScript and jQuery permeated the web. Now that is no longer a concern. The concern is coming full circle to display and interactivity. Lots of things are taken into account with the timings, though; it is not totally front-end render. The project I am working on does have optimized database queries, but nonetheless it runs on a $10-a-month, 1gb-memory server. My company and I like to look at the future when we plan. I want my clients to be optimized for six months from now, when things change, not optimized for something I know is falling out of favor and about to be replaced, if it has not been replaced already. Nothing in this industry stagnates. One thing that amazes me about it is the technology changes that introduce perceived regressions. A really good example is combining files into one file. That was the de facto standard for years. Now, because of the advent of HTTP/2 and its almost total adoption, it is no longer the best practice.
j.kaspar (Author) Posted June 27, 2017
Very interesting and illuminating discussion. Upgrading to PS 1.7 is something I thought I would never even consider. In fact, after all I have read about 1.7, I thought that in the future I would switch to thirty bees, as it sounds promising to have an upgraded and fine-tuned PS 1.6. What is written here gives me a completely different point of view. So, to sum it up: there is no way to achieve it in PS 1.6, but sooner or later it will not matter. I think I am going to stay with PS 1.6 for a while, because I invested a lot of work in customizing the template and other stuff just "a while" ago, so I am not really eager to do it again so soon. Thank you
Dh42 Posted June 27, 2017
No, it can be achieved, and it is being worked on now. Many months ago I gave the unconcerned development team from PrestaShop a way to do it on Gitter. They stuck to their guns on the way they were doing it and would not even consider changing. So I have taken this method with me and started working on it. The solution is simple, but it requires theme devs and module makers to play along as well. What you have to do is further shard the bootstrap library. Once it is further sharded, you can compile it from partials into whatever you want. At the same time, you introduce new hooks for CSS files and for JS files. Basically, you want to introduce a top hook and a bottom hook for each file type. Then you need your theme and module developers to respect those hooks. Have your top menu load after your bootstrap library so that the top-hook CSS is loaded first, then have the lower hook load the remaining lower CSS. That is the way you can do it. But, like I said, render blocking is only one small part of the story. It does not matter that your CSS does not block render if it takes your site 8 seconds to start sending data; people are going to leave in the first couple of seconds. That is why you need an all-around approach, and why the demo I showed loads faster and has more features on the frontend than the 1.7 demo.
El Patron Posted June 27, 2017
We are talking about render blocking of assets, so IMHO PS rules the pool with the new asset manager. As a developer, I pretty much gave up on PS until 1.7 (template inheritance); as a performance geek, the ability as a dev to define asset attributes is huge. What will happen in the PS eco with the new dev environment? Modules that support features found in major ecommerce systems that could not be created other than via custom per-shop code. For me, 1.7 was about creating a dev framework that integrates best practices in PS. Now we will see a much richer PS module eco that helps shop managers grow their businesses. Any CMS not employing similar dev tools and, more importantly, template inheritance will become irrelevant soon.
Dh42 Posted June 27, 2017
If by "rule the pool" you mean the rendering of pages takes more than 3 times longer, yes, it rules the pool. The bottom of the pool. I posted facts backed up by numbers, numbers from the largest search engine in the world. The 74 and 78 that the new theme gets are not promising here either: https://developers.google.com/speed/pagespeed/insights/?url=http%3A%2F%2Ffo.demo.prestashop.com%2Fen%2F&tab=desktop It still pulls the render-blocking error.
j.kaspar (Author) Posted June 27, 2017
I understand what you want to say, but I just want to note that the PageSpeed result of the demo cannot really be taken into account. It clearly lacks the basic settings...
Dh42 Posted June 27, 2017
I cannot be held responsible that they set up a bad demo, or that I cannot get the latest version to install so that I can set up a demo myself.
j.kaspar (Author) Posted June 27, 2017
Indeed, you cannot. I didn't even imply otherwise. It would be interesting to see a benchmark of a real, well-configured 1.7 in a production environment.
Dh42 Posted June 27, 2017
It would. But the 1.7 theme has been stripped of almost all useful features so they can be sold back to the community. It is hard to test something that has more features against something that is the bare minimum.