thirty bees forum


Posts posted by Traumflug

  1. To add a technical datapoint: page caching of any kind can only cache the HTML part of the page, the initial page load. That's where all the PHP voodoo gets executed. All the stuff after that initial page load, CSS, JS, images, whatever, isn't subject to page caching, no matter how it's configured. And one can easily measure how long this initial page load takes.

    Instrument #1: use curl on the command line, like:

    time curl https://librairiezbookstore.com/
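
    For a finer-grained look with instrument #1, curl can report its own timing phases through the '-w' option. A minimal sketch, using the URL from above:

    curl -s -o /dev/null \
      -w 'DNS: %{time_namelookup}s  first byte: %{time_starttransfer}s  total: %{time_total}s\n' \
      https://librairiezbookstore.com/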
    

    Instrument #2: the browser's developer console. Open it, click on the 'Network' tab, then reload the page. This generates a list of all the resources loaded; look at the very first entry, it should report the loading time.

    With both instruments I get between 540 and 590 milliseconds on librairiezbookstore.com. That's the part page caching can work on. Very good setups take like 350 milliseconds, so even the best imaginable page caching could never improve page loading performance by more than about 0.2 seconds. If one sees a bigger improvement for the complete page load, it's the result of content no longer displaying.

    • Like 1
  2. 18 hours ago, movieseals said:

    What is obvious for you guys might not be for people who are just selling things and not technicians.

    Why don't you just believe the technicians, then? Quite a number of merchants search for a silver bullet, even though it has been said many times that no such bullet exists. The guidance you ask for is very clear: don't use full page cache.

    If you want faster page loading times, reduce modules and content. Uninstall modules, don't just disable them. thirty bees is very fast by default and can be customized heavily while keeping this performance. A good example is @wakabayashi's technically brilliant https://www.chesspoint.ch/

    • Like 1
  3. 18 hours ago, movieseals said:

    So far everything seems to work as intended but if I understand you correctly, it shouldn't?

    It can't. One cannot cache dynamic content.

    Because people refuse to believe this, Full Page Cache was introduced. It's kind of a simplified template engine running in parallel to the already existing one, and it moves responsibility for correctness from the engine to the user.

    • Thanks 1
  4. Looking up one of the major players to get a basic idea about pricing is always a good idea, even if they don't necessarily have the best offering.

    https://www.hetzner.de/webhosting

    https://www.hetzner.de/managed-server

    As one can see, prices largely depend on the expected traffic. Hosting for € 2.-/month should do for a hobbyist site where page loading time doesn't matter much and one rarely expects more than one visitor at a time. The more visitors one expects and the more competitive response times should be, the more expensive it gets.

    Not an easy decision 🙂 Just one recommendation: if you have the choice, stay away from storage on magnetic disks (often named 'HDD'). Unless one wants to store terabytes of data, SATA SSD or, even better, NVMe SSD is state of the art.

  5. There are certainly modules which check postcodes on the fly and reject them if needed. When the address gets saved, they query a database of all existing postcodes to check whether the given postcode is genuine.

    None of the thirty bees modules does this, but perhaps there's another module installed which could be the candidate here. 'Address verification' or something similar. Disable that one and try again.

  6. 14 hours ago, Sigi said:

    What speaks against it?

    Many changes could be done automatically, for example raising the number of characters allowed in a database field. Other changes can lead to data loss, for example lowering that number.

    The module can't (yet) distinguish between these two kinds of changes.
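
    To illustrate the two kinds, a minimal sketch; table, column and database names are made up, not the actual thirty bees schema:

    # Raising the limit preserves all existing data:
    mysql shopdb -e "ALTER TABLE address MODIFY postcode VARCHAR(24);"
    # Lowering it truncates longer values or fails, depending on the SQL mode:
    mysql shopdb -e "ALTER TABLE address MODIFY postcode VARCHAR(8);"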

    • Like 1
    • Thanks 1
  7. Off the top of my head, these steps are required to duplicate a module (a shell sketch follows after the list):

    1. Duplicate the module folder.
    2. Give the new folder a meaningful name.
    3. Inside that folder is the main PHP file, same name as the folder, plus .php suffix. Rename this to the new folder name.
    4. This main PHP file defines a class making up the module. In its constructor a property '$this->name' gets set; change that to the new name as well.
    5. Delete all files starting with 'config' and ending with '.php', they'll get recreated automatically.

    That said, duplication makes no sense for some modules. For example, for modules defining a database table: the module copy would use the same table, or require more substantial coding efforts to avoid that.
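
    A minimal shell sketch of steps 1, 2, 3 and 5, run from the shop's root directory; 'oldmodule' and 'newmodule' are placeholder names, and step 4 still needs a text editor:

    cp -r modules/oldmodule modules/newmodule
    mv modules/newmodule/oldmodule.php modules/newmodule/newmodule.php
    # Step 4 by hand: set $this->name = 'newmodule'; in the constructor.
    rm modules/newmodule/config*.php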

  8. Chances are good one imports this trouble by importing the old database. Starting from scratch with an old database doesn't make much sense.

    To clean an existing installation:

    • uninstall and delete all non-thirty-bees modules,
    • delete all overrides, which means all files ending in .php in overrides/, except index.php (see the find sketch after this list),
    • use Core Updater to update the installation until it reports no obsolete and no modified files.
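
    A minimal sketch for the overrides step, run from the shop's root directory. The first command is a dry run listing what would be deleted, the second one actually deletes:

    find overrides/ -name '*.php' ! -name 'index.php'
    find overrides/ -name '*.php' ! -name 'index.php' -delete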

    To move from one domain to another, move/copy all files and move/copy the database. This should give a working back office. To move front office as well, adjust the domain URL in Preferences -> SEO & URLs, third panel from the top.
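
    A minimal sketch of such a move; host name, paths and database name are made up, and it assumes database credentials in ~/.my.cnf on both hosts as well as an empty database already created on the new one:

    # Copy all shop files to the new host:
    rsync -a /var/www/oldshop/ newhost:/var/www/newshop/
    # Copy the database:
    mysqldump shopdb > shopdb.sql
    scp shopdb.sql newhost:
    ssh newhost 'mysql shopdb < shopdb.sql'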

     

    • Like 3
  9. On 3/10/2020 at 9:56 AM, grestart said:

    After 2-5 seconds i get this error: "Error during download". I attach printcreen.

    This printscreen also shows a way to work around the problem. Download these two ZIP files (first line on the printscreen) by some other means and place them where psonesixmigrator wants them (third line from the bottom). This might well give a better idea of why the download fails.

    Once these downloads are done manually, psonesixmigrator should pick the already downloaded ZIPs up and finish the migration.

  10. Unless you're prepared to learn Git command line stuff, the best bet is to use Core Updater with the Bleeding Edge versions. Core Updater follows the repository on GitHub closely and gets updated automatically; each commit to the GitHub repository arrives there a couple of minutes later. It's designed to behave this way and to get used like this.
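
    For completeness, the Git route mentioned above would start something like this, assuming the main thirty bees repository on GitHub; it only fetches the sources, turning them into a runnable shop takes additional steps:

    git clone https://github.com/thirtybees/thirtybees.git
    cd thirtybees
    git pull    # repeat later to pick up new commits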

    This also means: version 1.1.x there as of today isn't the same as 1.1.x last week or next week.

    • Like 1
    • Thanks 1