thirty bees forum

the.rampage.rado

Silver member
  • Posts

    912
  • Joined

  • Last visited

  • Days Won

    62

Posts posted by the.rampage.rado

  1. It should be possible to pull the name and display it in the theme as it's saved now.

    Another good enhancement would be the ability to 'group colors'.

    Something like:

    Group "Blue" - sky blue, royal, brand1_blue, brand2_blue, some_other_blue
    Group "Yellow" - sun_yellow, very_hot_lava_yellow, neon_yellow, solar_shine
    Group "White" - cream, very_white
    Group "Green" - light green, leaf green, dark green
    etc.

    And then have those groups displayed in the layered navigation module for selection.
    That way the customer can filter for all 'blue', 'red' or 'yellow' products even though the majority of them use 'brand colors'.

    The two current workarounds are to give every product a generic color name and hex code, or to let the layered filter show hundreds of options with only 2-3 products each.

    Of course this means a general rework of core, modules and theme.

  2. This is a short guide for those who are running thirtybees on LiteSpeed servers. I'm running my shops on shared hosting.

    1. First of all, you have to have a LiteSpeed server with a paid license for the cache and the crawler enabled (on the server side). If something is not working, please contact your host and check whether the crawler is enabled, etc.

    2. Install the free LiteSpeed Cache module. With it you will be able to configure various settings. The current version is 1.4.1 and it should work with thirtybees 1.5.1 with the following settings:
    [screenshot: LiteSpeed Cache module settings for thirtybees 1.5.1]

    (The module is written for PrestaShop, of course, and is not as well supported as the ones for Magento or WP, but it works fine with thirtybees as of now.)

    At this point you can test whether LSCache is working - open your front office and load a certain page. If you load this page for the first time, the response header for the generated HTML should show "miss":

    [screenshot: X-Litespeed-Cache response header showing 'miss']

    After a page reload it should show "hit":
    [screenshot: X-Litespeed-Cache response header showing 'hit']


    (How to check this: right-click anywhere, click 'Inspect', go to 'Network', scroll to the top of the list on the left and look for your generated page, click it, and on the right side under the Headers tab you will see a lot of information. Near the end of the first section you should find 'X-Litespeed-Cache:', which will be either 'miss' or 'hit'. If you loaded the current page for the first time it should show 'miss'; on reload it should show 'hit', which means your LSCache is configured and working properly.)
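
    If you prefer the command line, the same check can be done with curl (the URL is just an example):

    curl -s -o /dev/null -D - https://www.myshop.com/ | grep -i x-litespeed-cache

    The first request should print 'X-Litespeed-Cache: miss'; repeating it should print 'hit'.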

    3. Download the LiteSpeed LSCache crawler script and place it in your website root (next to index.php). Some documentation for the plugin and the crawler can be found here. Make it executable with 0711 permissions.
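
    Over SSH that would be something like this (assuming the script sits in public_html, as above):

    cd ~/public_html
    chmod 0711 cachecrawler.sh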

    4. You have to have the thirtybees Sitemap module installed and running. If you don't know how to configure it, just ask in a comment and somebody will hop in to help, but the module is pretty self-explanatory. Generate your sitemap and leave this page open, as you will need to copy the sitemap link.

    5. As I said, I'm using shared hosting, so I go to my cPanel -> Cron jobs and make a new entry with the following command:
     

    public_html/cachecrawler.sh https://www.myshop.com/1_en_0_sitemap.xml

    Just for testing whether everything is working, you can run this job every 5 minutes (my shop with ~250 products takes ~120 seconds to crawl every page with the default crawl interval). A sample crontab entry is shown below.

    (Please note that bash should be installed and running to use this script.)
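
    For reference, the equivalent crontab entry for the 5-minute test schedule, if you manage cron by hand instead of through cPanel (the absolute path is a placeholder - use the full path to your home directory):

    */5 * * * * /home/youruser/public_html/cachecrawler.sh https://www.myshop.com/1_en_0_sitemap.xml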

    6. If you configured the cron to run every 5 minutes, wait 2-3 more minutes and visit your shop again. This time open the Developer tools in advance and load any link that you know was not visited recently. If the crawler did its job, you should see a direct 'hit' there.

    7. To troubleshoot the crawler, it is recommended to take a look at the email that is sent after the cron job is done. You can find information on how to configure this in many places online. In this email you can see which pages are already cached and therefore skipped, which are being cached, etc.

    8. Once everything works, it is recommended to edit your cron job and set it to run every 12 or 24 hours.
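
    The same entry switched to every 12 hours, with an optional MAILTO line so you actually receive the report email mentioned in point 7 (address and path are placeholders):

    MAILTO=you@example.com
    0 */12 * * * /home/youruser/public_html/cachecrawler.sh https://www.myshop.com/1_en_0_sitemap.xml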

    And that's all, enjoy your faster first page loads!

     

    Troubleshooting (in addition to the LSCache documentation page):

    1. I'm using the Warehouse theme and the module comes with a preconfigured profile for it. If your cart module or any other acts funny, you can play around with ESI hole punching - more info in the documentation.

    2. If you are using the Blackhole for Bots module from DataKick, please keep in mind that the server running the cron job could end up in the blacklist if the Sitemap settings are not correct. Any blacklisted folder should be excluded from the sitemap (I had problems with /modules/; exclude everything you don't need in the sitemap). If you are locked out, you can delete the latest IP from the blackholeforbots table, regenerate the sitemap and test again.

    3. If you're using URLs with accented characters (or Cyrillic letters, as I am) you have to use the -e switch like so:

    public_html/cachecrawler.sh https://www.myshop.com/1_en_0_sitemap.xml -e

    Otherwise the crawler will skip every URL with accented characters and will only crawl the first couple of pages of your sitemap.

    4. If some pages are forbidden or not available, the crawler script will stop after 15 such pages. To override this you have to edit it: look for PROTECTOR='ON' near the very beginning and turn it OFF. In general this will not be needed, but it can help when troubleshooting together with the Blackhole for Bots module.
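
    That is, the line near the top of cachecrawler.sh would change to something like:

    PROTECTOR='OFF'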

    5. In general, don't have pages in your sitemap that are blocked by robots.txt when you are using the Blackhole for Bots module.

    6. If, like me, you are using Multistore, be extra careful if you have removed all module pages from your sitemap (leaving reviews and other front-facing pages) PER SHOP, because the All Shops context currently does not affect the Per Shop settings. You can end up with an edge-case scenario with two different versions of your sitemaps, depending on whether you generate them manually or leave the cron to do so. So as of today - make all Per Shop settings in Sitemap comply with robots.txt too.

  3. 7 hours ago, DRMasterChief said:

    from each font i have (locally)  eot / eot?#iefix / woff2 / woff / ttf / svg

    e.g. like this:   font-family: 'Open Sans';
      font-style: normal;
      font-weight: 400;
      src: url('../fonts/open-sans-v15.....

    and maybe i have to add the  v40  ????

    found this: https://stackoverflow.com/questions/22295165/googles-open-sans-regular-400-always-italic 

     

    I have done a search and the loaded files from my question are in modules \authorizeaim   and in \nocaptcharecaptcha  and in  admin\filemanager

    From what I can find online (that isn't posted back in 2012), woff2 is the only format we currently need, as all modern browsers (mobile and desktop) support it. So I only load that format locally.

    A nice read: https://stackoverflow.com/questions/75868078/what-are-the-possible-values-for-format-in-a-font-face-src
    A nice tool to download only the needed fonts and generate the code for them: https://gwfh.mranftl.com/ 


    Regarding the font version - I think if you load only one version and stick to it, it's OK. There is no need to load 2 or more versions, as this might cause different browsers to render your page differently if they fall back to one version or another and there are differences between them. (For example, Montserrat, which I use, has weight changes between versions.)

    Also, if the tool you use for generating the code does not add it, do so manually:

    font-display: swap; /* Check https://developer.mozilla.org/en-US/docs/Web/CSS/@font-face/font-display for other options. */
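
    Putting it together, a minimal woff2-only @font-face sketch (family name, weight and file path are placeholders - use whatever the generator gives you):

    @font-face {
      font-family: 'Open Sans';
      font-style: normal;
      font-weight: 400;
      font-display: swap; /* show fallback text immediately, swap in the web font when it arrives */
      src: url('../fonts/open-sans-v40-latin-regular.woff2') format('woff2'); /* placeholder file name */
    }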



    A statistic: https://caniuse.com/woff2

  4. Thank you!

    "Hide this status in all customer orders." is something that will work as a workaround I think. Not many orders I have that go to Canceled - only those that are fraud, duplicated or test orders from me. So those that are fraud will show the last state before Cancelled, which is 100% working and the only issue is duplicated orders but those customers are contacted prior to this cancelation so they will not get confused if they see the wrong status couple days after that.

  5. I would like to make a 'sleeper status' for all fraud orders, and I want to give it some obscure name like 'Pending' so the scammers don't see 'Canceled' in their profiles.

    But I want this status to restock all products in my shop, and I don't want any further emails to be sent to them.

    [screenshot: the options of the Canceled order status]

    I'm unable to see which flag here triggers the restock action.

  6. On 3/23/2024 at 3:42 AM, the.rampage.rado said:

    I have plenty of those requests "//2019/wp-includes/wlwmanifest.xml" what should be the formatting in the first part of the redirect?

    To update with a fix I found that worked for me:

    All of those requests starting with // contain wp-includes, wp-admin or wp-content, so this solves the issue:

     

    RedirectMatch 301 wp-includes /modules/blackholebots/blackhole/
    RedirectMatch 301 wp-admin /modules/blackholebots/blackhole/
    RedirectMatch 301 wp-content /modules/blackholebots/blackhole/
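
    To verify one of the rules after editing .htaccess, you can request a typical probe URL and check that it answers with a 301 pointing at the blackhole path (the domain is just an example):

    curl -s -o /dev/null -D - https://www.myshop.com//2019/wp-includes/wlwmanifest.xml | grep -iE '^HTTP/|^location'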

     

  7. As you probably know, this has been a security topic for a couple of years now.

    By default thirtybees (and PS for that matter) does not come with a default policy, and on this test a vanilla installation fails with an F.


    One way to implement some of the most important headers is to add this code to the beginning of your .htaccess file:

     

    <IfModule mod_headers.c>
       Header set Content-Security-Policy "default-src 'unsafe-inline' 'unsafe-eval' 'self' *.googleapis.com *.gstatic.com *.cloudflare.com *.googletagmanager.com *.google-analytics.com *.youtube.com *.google.com;"
       Header set X-XSS-Protection "1; mode=block"
       Header always append X-Frame-Options SAMEORIGIN
       Header set X-Content-Type-Options nosniff
   Header set Strict-Transport-Security "max-age=63072000; includeSubDomains; preload"
   Header set Referrer-Policy "strict-origin-when-cross-origin"
       Header unset X-Powered-By
       Header always unset X-Powered-By
    </IfModule>


    This code is far from perfect; 'unsafe-inline' and 'unsafe-eval' should be used very carefully, but can we remove them completely if we're not sure whether we have inline JS? This code gives an A on this test, but keep in mind that on the first row you should add all external resources you use, and on each row the settings should be customised to your needs. This is what appears to work for me.
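
    A quick way to confirm the headers are actually being sent once .htaccess is edited (the URL is just an example, assuming curl is available):

    curl -s -o /dev/null -D - https://www.myshop.com/ | grep -iE 'content-security-policy|strict-transport-security|x-frame-options|referrer-policy'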

    Another (nicer) solution is this free module from nenes25, which adds further options for logging, debugging and testing in the BO. Unfortunately, the support for PS 1.6.1.x that is advertised on his blog does not carry over to the latest 0.4 version, and I'm unable to install it on my test setup.

     

    What are you using to fix this?

  8. I stumbled upon an easy fix that is recommended a lot online for this notice in Google PageSpeed, and I would like to ask the members with coding experience whether it works and whether we should really care about this notice.

    https://github.com/zzarcon/default-passive-events

    I'm using it by adding this code to the Custom code section of my thirtybees, and when checking with the tool the notice is gone. There are comments online that this file should be loaded before all other .js files and not at the end of the page, where we move them with thirtybees. I'm unable to observe any scrolling improvement (or I'm not looking where I should)... Of course, one less line in the Google report means one more point overall; not much, but if this works it's a five-minute job.

     

    <script type="text/javascript" src="https://unpkg.com/default-passive-events"></script>



    If this works, could we host this tiny .js file locally to avoid one more connection to an external CDN?
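
    If it does, a minimal sketch of self-hosting it would be to download the bundle once (the local path below is just a placeholder):

    curl -L -o js/default-passive-events.min.js https://unpkg.com/default-passive-events

    and then change the src in the script tag above to point at /js/default-passive-events.min.js instead of the unpkg URL.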
