From what I understand, thirty bees has integrated caching support. However, when running a test on the front office demo with Google PageSpeed Insights, the server response time is 1.1 seconds. My 1.5 installation on shared hosting is 4-5 times faster than that, and when testing a caching module in the past I had a response time of around 150 msec on standard shared hosting.
Does this mean that caching is not enabled by default or still in development?
My theme uses Bootstrap and it is responsive.
Don’t you think 1.1 sec is a very long load time?
It does seem like it. When you disable all third party modules does it help the loading time? You could have a module that is poorly written.
Sorry, maybe I was not super clear.
I am just putting your front office demo through Google PageSpeed. I have not installed it on my hosting or anything. I just compared the metrics of the demo here to what I have today, and the demo's server response time was significantly slower than my current installation, which is not using any external caching plugin. However, as I mentioned, if I enable the ExpressCache module my shop has a server response time of around 100 msec, while the TB demo is at over 1 sec, and you previously indicated that you are using a caching module that should be faster than ExpressCache.
So to summarize:
- My shop on PrestaShop 1.5, no caching module. The server normally responds in around 300-700 msec, and I have a lot more things on my page than the TB demo. I am running on cheap shared hosting.
- My shop with the ExpressCache plugin has a server response time of less than 100 msec (I had some issues with the module, so I cannot run it live yet).
- https://front.thirtybees.com/ shows server response times around 1 sec and the index page is very lightweight compared to many Live shops.
I am just pointing this out since speed would be one of my main incentives to migrate, and I was having high hopes for an embedded caching module that works out of the box and could produce results as impressive as, for instance, ExpressCache's.
Happy to help with anything I can contribute with in terms of testing or feedback.
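For reference, the "server response time" number being compared in this thread (time to first byte) can be measured directly instead of relying on PageSpeed. A minimal sketch, assuming Python 3; the shop URL and the `measure_ttfb`/`summarize` helper names are just illustrative, not part of any tool mentioned here:

```python
import time
import urllib.request
from statistics import median

def measure_ttfb(url, samples=5):
    """Roughly time from sending the request to receiving the first byte.

    Note: this includes DNS/TLS setup, so it approximates what PageSpeed
    reports as server response time rather than matching it exactly.
    """
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read(1)  # first byte received -> the server has responded
        times.append((time.perf_counter() - start) * 1000.0)  # msec
    return times

def summarize(times_ms):
    """Reduce repeated samples to stable numbers (best and median, in msec)."""
    return {"best": min(times_ms), "median": median(times_ms)}

if __name__ == "__main__":
    print(summarize(measure_ttfb("https://front.thirtybees.com/")))
```

Taking several samples and looking at the best/median matters here, because single-shot numbers from test tools vary a lot from run to run, as noted later in the thread.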
Honestly, the PageSpeed metric is about to be done away with because of problems with it being useful in the real world, so we don't target it. It also gives misleading load-time results because of how Google runs the test: if your site is hosted in the US but an EU visitor runs PageSpeed on it, the test runs from an EU server, which I think we all know will show slower results because of network latency.
One test I do prefer is Pingdom, which shows the total load time of the site, and there it is well under a second: https://tools.pingdom.com/#!/ehSFr2/https://front.thirtybees.com/
What Google is actually transitioning to is Lighthouse. If your Chrome has already updated, it is built into Chrome 60 now. It doesn't give the blanket recommendations that PageSpeed does; it gives a more meaningful metric that shows what the user experience is like. That is why Google is migrating away from PageSpeed to Lighthouse.
See, PageSpeed just looks at things with a very broad stroke and does not take into account what the actual user experience is. A good example is render-blocking CSS and JS. It simply scores that metric based on the size of your files, nothing more. But CSS loads in a cascading way, so your first CSS selectors could paint your page before the file has fully loaded. That would be a meaningful paint: the moment when a mobile user clicks a link to your site and can actually see your site rendered on their screen.
If you run thirty bees through Lighthouse you will see different results, like this: https://www.screencast.com/t/DD6r7NixEqY If you run the slimmed-down 1.7 PrestaShop demo through Lighthouse you should see results like this: https://www.screencast.com/t/cXt6004lg6Qz To be fair, here is a Shopify demo as well: https://www.screencast.com/t/ZLlo4YEV5
See, now you are not relying on metrics that may have no bearing on the actual user. You are relying on the metric that decides whether users press the back button or not.
Thanks for the detailed feedback. I was not familiar with Lighthouse and indeed it shows the numbers in a different way. I will spend some time looking at the results to see what I can learn.
Thanks for all your efforts on TB, I do look forward to migrating.
I actually took the time to write a blog post about this if anyone wants to read it, https://thirtybees.com/blog/pagespeed-is-dead/
@lesley it’s a very nice blog post, though maybe you should have run the thirty bees test from Dallas and not Sweden (or San Jose, as it shows) for a fair comparison.
I’ve always relied on Pingdom and GTmetrix; I stopped using PageSpeed ages ago. I prefer Pingdom as I can test from Sweden, which is closest to our customer base, while GTmetrix uses Canada as its base. Since we don’t use a CDN, I know I’m going to get slower results from GTmetrix. Given our customer base is in the UK, we picked a hosting company closest to the bulk of our customers.
The thing I like best about both these tools, though, is the waterfall view of everything waiting/loading, as it clearly shows how crap my OPC is: it uses two RAND() queries and an Ajax call that take ages to process!
I’ve not tried Lighthouse before. I found the performance score varied a lot, with 49% on the first run, 92% on a second run and 80% on a third. The accessibility and best-practices scores both remained the same throughout each test. It is nice, though, to have another tool to add to the mix for testing.
Thanks Lesley, very informative article.
When I have a little time I will setup a TB clone on my server so I can do a fair compare using Lighthouse under the same conditions.
@DavidP Another thing you have to realize: tools like Pingdom and GTmetrix do not take into account the time it takes to render a page. If you have complicated JS that slows down rendering, you will never see it with those tools, and then you will be left wondering about the high bounce rate on your site.
@lesley good point. GTmetrix does give you a fully-loaded time but, like you say, not a rendered time, which is a great feature of Lighthouse. I think using it in combination with the other tools gives you a much clearer picture of what’s going on.
The jpresta page caching module I use helps in the rendering case in that it creates a ‘rendered’ page that is served to users instead of the page being built from scratch for each visitor, and it gives a visibly noticeable speed increase. The only downside is that whenever I change any code or CSS I have to run a script that visits every URL to re-create the pages. I’m assuming the thirty bees page caching will be on par with this module.
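The "visit every URL" warm-up script described above can be sketched in a few lines. This is a hypothetical sketch, not the jpresta or thirty bees implementation; it assumes the shop publishes a standard sitemap.xml, and the function names and example URL are made up for illustration:

```python
import urllib.request
import xml.etree.ElementTree as ET

# Standard sitemap namespace, per the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_from_sitemap(xml_text):
    """Extract all <loc> entries from a standard sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def warm_cache(sitemap_url):
    """Visit every URL in the sitemap so the page cache gets re-created."""
    with urllib.request.urlopen(sitemap_url) as resp:
        urls = urls_from_sitemap(resp.read())
    for url in urls:
        try:
            # A plain GET is enough: serving the page rebuilds its cache entry.
            urllib.request.urlopen(url).read()
        except OSError as exc:
            print(f"failed: {url}: {exc}")

if __name__ == "__main__":
    warm_cache("https://example-shop.com/sitemap.xml")  # hypothetical shop URL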
Modules like that are great; that is why we added that functionality to the core. Yeah, our page caching is very similar to that module, and we likely have slightly better performance as well. Clearing the cache is the downside of any full page cache, but it works well in the long run if you do not make too many edits in production.
So, to completely close out my initial question:
Lesley, what you are saying is that functionality similar to the ExpressCache or jpresta modules is an integrated part of thirty bees. Just like @DavidP, my experience with ExpressCache is that it provides a notable improvement in user experience because of the pre-caching, so I am really interested in testing that with TB as well.
My initial question actually came from the 1 sec response time on the demo site, which surprised me if this kind of caching was enabled. But perhaps it is because I run my test from Europe while your page is hosted in the US, and Google uses a European test server or something.
I will let you know once I have thirty bees installed.
@hubbobubbo Correct, we have the same caching feature built into the core. The only thing we do not have yet is a cache warmer like they do.
I think the reason for the 1 second wait is because of the location of the Google test machine. Seattle is a long way away from the EU.
Excellent. Looking forward to testing more.
Maybe cron job support for the cache warm-up/refresh could make it onto the future roadmap?
Yeah, it could be something we look into. I have wondered whether it is needed, because with the optimizations, running a site without the cache is fast as well.