thirty bees forum

datakick


Everything posted by datakick

  1. I've checked the SSL certificate for your mail server:

         openssl s_client -connect mail.jv80.se:587 -starttls smtp -crlf

     outputs:

         Connecting to 156.67.80.139
         CONNECTED(00000005)
         depth=2 C=US, O=Internet Security Research Group, CN=ISRG Root X1
         verify return:1
         depth=1 C=US, O=Let's Encrypt, CN=R11
         verify return:1
         depth=0 CN=mail.mxmail.pro
         verify return:1
         ---
         Certificate chain
          0 s:CN=mail.mxmail.pro
            i:C=US, O=Let's Encrypt, CN=R11
            a:PKEY: rsaEncryption, 2048 (bit); sigalg: RSA-SHA256
            v:NotBefore: Oct 7 10:04:49 2024 GMT; NotAfter: Jan 5 10:04:48 2025 GMT
          1 s:C=US, O=Let's Encrypt, CN=R11
            i:C=US, O=Internet Security Research Group, CN=ISRG Root X1
            a:PKEY: rsaEncryption, 2048 (bit); sigalg: RSA-SHA256
            v:NotBefore: Mar 13 00:00:00 2024 GMT; NotAfter: Mar 12 23:59:59 2027 GMT
         ---
         Server certificate
         -----BEGIN CERTIFICATE-----
         MIIFUjCCBDqgAwIBAgISA3dwbr6Y61zugeXx1GxKELtmMA0GCSqGSIb3DQEBCwUA
         MDMxCzAJBgNVBAYTAlVTMRYwFAYDVQQKEw1MZXQncyBFbmNyeXB0MQwwCgYDVQQD
         EwNSMTEwHhcNMjQxMDA3MTAwNDQ5WhcNMjUwMTA1MTAwNDQ4WjAaMRgwFgYDVQQD
         Ew9tYWlsLm14bWFpbC5wcm8wggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIB
         AQCSQ1HbHGiBbou7GOZhL0jYk2D3IK3Al48pX/OioRQJL57c0HFCFRGrJgJ523qQ
         gt9yHwmeSjr+JdsAedOw0evb2rKf3CaKfW7ECMkW0cUvM8yhOs2LyC8o+DLhhFGQ
         gh1VsfOetKN05zM11vLfqWpuRsLa7nqJTE1ZIxYLpe1pG1zVY2N36FqVdw06ptOw
         UxTxDzhdi5BbAsdjC8rVweo0Ja0pTUb9F+nmQV5F1U0g/eLsyjzQvyhFVhJdc1sH
         8YlDTw9NnSPm84GUlT/Gxzo3u7tMPYRh4KSE6i+uYUm21phRDZeUzzzYFGY4nfX1
         SoP/9Qqjg51T2xuv0Dgg5MpLAgMBAAGjggJ3MIICczAOBgNVHQ8BAf8EBAMCBaAw
         HQYDVR0lBBYwFAYIKwYBBQUHAwEGCCsGAQUFBwMCMAwGA1UdEwEB/wQCMAAwHQYD
         VR0OBBYEFDk9g01mxaxjoVTkhDGc59225HJQMB8GA1UdIwQYMBaAFMXPRqTq9MPA
         emyVxC2wXpIvJuO5MFcGCCsGAQUFBwEBBEswSTAiBggrBgEFBQcwAYYWaHR0cDov
         L3IxMS5vLmxlbmNyLm9yZzAjBggrBgEFBQcwAoYXaHR0cDovL3IxMS5pLmxlbmNy
         Lm9yZy8wfQYDVR0RBHYwdIIMbWFpbC5qdjgwLnNlghxtYWlsLmt0aW1hdGhlb3Bo
         YW5vdXMuY29tLmN5ghJtYWlsLm1lZGlhc2FmZS5wcm+CD21haWwubXhtYWlsLnBy
         b4IObWFpbC5teG1haWwuc2WCEW1haWwucGlzc291cmkub3JnMBMGA1UdIAQMMAow
         CAYGZ4EMAQIBMIIBBQYKKwYBBAHWeQIEAgSB9gSB8wDxAHcAzxFW7tUufK/zh1vZ
         aS6b6RpxZ0qwF+ysAdJbd87MOwgAAAGSZqXgAAAABAMASDBGAiEAiQZplHsW+AXR
         C5g1d1yuPRiPiIGACuOZn8ZBgQPB7z0CIQCwTvKO+VMaeOq8rRXaNiLdiqKlz7lk
         RH704XdJjJWIAgB2AD8XS0/XIkdYlB1lHIS+DRLtkDd/H4Vq68G/KIXs+GRuAAAB
         kmal54wAAAQDAEcwRQIgZqd3CmlCk+h6p8HfSW+SzmlfgwyENhHl4JbqdPZvKboC
         IQDJ762uDxba1ZT2GibDQn87EO/TVJaQh2uol0i9FG+NpjANBgkqhkiG9w0BAQsF
         AAOCAQEAVHukoNoGdJwB6urbDbq0tzCoK1RfdQK/IjZoiGPK6IiQS6SQH8tG8g+X
         HhFfsnSdpPLK4UHB/e1KnGD0YuHXrYhBSsF4wSsq4bwNp6o+123P8fIblEVZStZG
         Wyfhj/mpmpN86LPs7sJRSrZREmU2txdSx0F930AgDrPZ3sdTYuEs4SQnyymdRcbo
         P+iERwxCnOX5SFuEEYWW75WSOWGIY34L8py+mFLdy+C/l4rv/yXNLOT9HuT+FbP5
         1/VewuSEp/gCDTxQT9PqgwGDuX7KWcp77iho6zqgNyPyW1SU3qhvfpg0AeT1XHU5
         iAcmR7M8XMqDOpv+4p5XhY//5gREqQ==
         -----END CERTIFICATE-----
         subject=CN=mail.mxmail.pro
         issuer=C=US, O=Let's Encrypt, CN=R11
         ---
         No client certificate CA names sent
         Peer signing digest: SHA256
         Peer signature type: RSA-PSS
         Server Temp Key: X25519, 253 bits
         ---
         SSL handshake has read 3408 bytes and written 433 bytes
         Verification: OK
         ---
         New, TLSv1.3, Cipher is TLS_AES_256_GCM_SHA384
         Server public key is 2048 bit
         This TLS version forbids renegotiation.
         Compression: NONE
         Expansion: NONE
         No ALPN negotiated
         Early data was not sent
         Verify return code: 0 (ok)
         ---
         250 DSN
         DONE

     The SSL certificate was issued primarily for the domain mail.mxmail.pro, not for mail.jv80.se. However, when you decode the certificate (for example with https://www.sslshopper.com/certificate-decoder.html), you will see that the certificate CAN be used by mail.jv80.se, because this domain is listed in the Subject Alternative Names section. So the SSL handshake should be successful. However, it looks like PHP's native SSL method does not check SAN entries when verifying the peer name -- it expects the peer CN to match the requested hostname, and does not consult the SAN list at all.

     Fortunately, it might be possible to force PHP to accept the SAN if you provide the peer name in the SSL options, as described here: https://github.com/PHPMailer/PHPMailer/issues/1113

     You can test this by editing the file /modules/tbphpmailer/src/PhpMailerTransport.php and inserting this code:

         $message->SMTPOptions = [
             'ssl' => [
                 'peer_name' => 'mail.jv80.se',
             ],
         ];

     The result should look something like this: (screenshot)

     Let us know if this helped.
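To illustrate why the handshake should succeed for mail.jv80.se, here is a minimal sketch of SAN hostname matching. Note: hostCoveredBySan is a hypothetical helper written for illustration, not PHP's actual verification code; the SAN entries are taken from the decoded certificate above.

```php
<?php

// Hypothetical helper: does $host match one of the certificate's DNS
// Subject Alternative Names? Supports single-label wildcard entries.
function hostCoveredBySan(string $host, array $sanDnsNames): bool
{
    foreach ($sanDnsNames as $san) {
        if (strcasecmp($san, $host) === 0) {
            return true;
        }
        // A wildcard like *.example.com covers exactly one left-most label
        if (strncmp($san, '*.', 2) === 0) {
            $dot = strpos($host, '.');
            if ($dot !== false && strcasecmp(substr($host, $dot), substr($san, 1)) === 0) {
                return true;
            }
        }
    }
    return false;
}

// Some of the SAN entries from the certificate decoded above
$sans = ['mail.jv80.se', 'mail.mxmail.pro', 'mail.mxmail.se', 'mail.pissouri.org'];

var_dump(hostCoveredBySan('mail.jv80.se', $sans));    // bool(true)  -- covered by SAN
var_dump(hostCoveredBySan('mail.example.com', $sans)); // bool(false)
```

A verifier that only compares the CN (mail.mxmail.pro) against the requested hostname would reject mail.jv80.se, even though the SAN list explicitly allows it -- which is exactly the failure mode described above.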
  2. That is not an application issue, but a server issue. Check this: https://github.com/PHPMailer/PHPMailer/wiki/Troubleshooting#certificate-verification-failure Most likely, your hosting provider is redirecting all smtp traffic to their own smtp server.
  3. Yes, that's the one. Deactivate this. It would work nicely on websites that follow REST principles and don't have side effects on GET requests. But on sites that perform side effects on GET requests, it's very dangerous. It's stupid that in thirty bees you can 'delete' a record simply by typing a url into the address bar and hitting enter.
  4. Hi everyone, I just wanted to raise your attention to the fact that cloudflare recently enabled the Speculation Rules API for all plans. This functionality is designed to improve browsing speed by aggressively prefetching potential future assets/pages etc. However, these prefetch requests can sometimes be quite dangerous.

     Example: I'm on the back office modules page, and I click the 'Uninstall module' button. A confirmation dialog is displayed to ask if I'm sure. But the question is irrelevant -- because of this new prefetch functionality, the browser has already sent a request to the server to prefetch the response for the uninstall action url. The module is already uninstalled. You can see the request in the network tab. Even though cloudflare responded with a 503 error code (meaning the prefetch response will be ignored by the browser), the request still made it to the server, and the action was executed. If you then click 'OK', thirty bees sends the actual request, and it actually fails with the error message "This module has already been uninstalled". That's nice, isn't it.

     The uninstall/install module buttons are not the only ones that are impacted. For example, 'delete' or 'approve' or 'send' buttons in back office lists, etc. It can most likely also impact the front office -- the browser can automatically add a product to the cart because it believes the user will click the "Add to cart" button soon, so better be prepared for that... right. It's quite a dangerous optimization.

     It's true that if thirty bees used POST instead of GET requests to implement these kinds of actions, this situation could not happen. But we can't really change that. We will look into a way to prevent/mitigate this problem. Fortunately, the browser sends some HTTP headers that we can use to determine whether a request is a regular one or a prefetch, so we can use that to prevent this (hopefully).

     But until a fix is implemented, I advise you to disable this new prefetch optimization in your cloudflare dashboard.
And to be sure, maybe even after that 🙂
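The header-based detection mentioned above can be sketched as follows. This is a simplified illustration, not the actual thirty bees fix; it assumes Chromium's speculation-rules requests carry a Sec-Purpose header (e.g. "prefetch" or "prefetch;prerender"), while older prefetch mechanisms used a Purpose header.

```php
<?php

// Sketch: decide whether an incoming request is a speculative prefetch,
// based on the Sec-Purpose / Purpose request headers.
function isSpeculativePrefetch(array $headers): bool
{
    $headers = array_change_key_case($headers, CASE_LOWER);
    foreach (['sec-purpose', 'purpose'] as $name) {
        if (isset($headers[$name]) && stripos($headers[$name], 'prefetch') !== false) {
            return true;
        }
    }
    return false;
}

var_dump(isSpeculativePrefetch(['Sec-Purpose' => 'prefetch;prerender'])); // bool(true)
var_dump(isSpeculativePrefetch(['Accept' => 'text/html']));               // bool(false)
```

A controller could call such a check on the incoming request headers and refuse to execute a destructive GET action (uninstall, delete, approve, ...) when the request is only a prefetch, for example by responding with 403.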
  5. There is no change. Neither thirty bees, nor prestashop, sends email notifications about new orders to merchants by default. This functionality is provided by mailalert module.
  6. No, it will not
  7. I'm not sure, I didn't test. The upcoming module is a backport of mollie module for ps17, so hopefully it will be supported.
  8. Also, you should use a tool like https://technicalseo.com/tools/robots-txt/ to check that it works properly
  9. The other disallow directive is for stores without friendly urls enabled - there is no change in the blackhole name there. The robots.txt should look like this:

         User-agent: *
         Disallow: */blackholenew/
         Disallow: /modules/blackholebots/blackhole/*

     Note the * before /blackholenew/ -- it's there to block language variants as well
  10. I've released a new version of the module that allows you to change the trap URL. If bing is already indexing your trap url for any reason, you can change it from https://domain.com/blackhole to something new like https://domain.com/my-honey-trap (and change robots.txt accordingly). This way, when bing sends traffic to the /blackhole address on your website, it will not be blocked. To prevent a 404, I suggest you also add a redirect from /blackhole to the homepage in your .htaccess file. Hopefully, bing will not add the new trap url to the index again. I've added some extra precautions to prevent this as well -- if a known good bot (google, bing, etc.) somehow makes it to your trap url (even when robots.txt blocks it), then the content of the trap page will be mostly empty, and the page headers will contain <meta name="robots" content="noindex">, which instructs the bot not to index this page.
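The .htaccess redirect mentioned above could look like this (a sketch, assuming Apache with mod_rewrite enabled; the optional two-letter language prefix in the pattern is an assumption about how the trap url appears on multilingual stores):

```apache
RewriteEngine On
# Send the retired /blackhole trap url (with or without a language prefix)
# to the homepage, so visitors coming from bing get a redirect instead of a 404
RewriteRule ^([a-z]{2}/)?blackhole/?$ / [R=301,L]
```

Put it before the store's own rewrite rules so the redirect wins; adjust the pattern if your trap url lived under a different path.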
  11. Did you add that entry into the robots.txt from the very beginning - at the same time you installed the blackholebots module? If you added this later, then bing might have already indexed the page.
  12. You should ask duckduckgo this question, not me. If the robots.txt explicitly blocks the url, there shouldn't be any reason for them to index it.
  13. We have a custom sendylists module to synchronize thirty bees with lists in sendy. However, we had to modify the sendy installation itself, because it doesn't support bulk api calls by default. So this module is not useful without that modification. That's one of the reasons why we didn't release it publicly.
  14. Thanks.

      1) We know about the first set of warnings -- this is an issue with legacy code that stored transient information into objects during import. It will take a lot of refactoring to fix this, unfortunately. Currently it's not high on the priority list, as it's still just a deprecation warning. It will be an issue on php9, though.

      2) Tools::getDateFromDateFormat - thanks, fixed in bleeding edge.

      3) consistency check module -- I haven't updated the module to check for all supported extensions -- do not use this module on bleeding edge yet.
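For context on the first set of warnings: my assumption is that "storing transient information into objects" trips PHP 8.2's dynamic-properties deprecation. A minimal illustration of that pattern (the class and property names here are made up):

```php
<?php

// Since PHP 8.2, writing to a property the class does not declare
// emits a deprecation warning (and PHP 9 is expected to turn it into an error).
class LegacyImportRow
{
    public $name; // declared property: always fine
}

$row = new LegacyImportRow();
$row->name = 'Shirt';

// Legacy import code stashes transient state like this; on PHP >= 8.2 it emits
// "Deprecated: Creation of dynamic property LegacyImportRow::$tmpState ..."
$row->tmpState = ['line' => 42];
```

The usual fixes are declaring the property, moving the transient state into a separate structure, or (as a stopgap) marking the class #[\AllowDynamicProperties] -- all of which means touching a lot of legacy code, hence the refactoring effort mentioned above.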
  15. Hi, use the core updater module to check (and fix) database differences. Your update probably failed before the db was migrated properly.
  16. Should be fixed in bleeding edge, please try
  17. It's a bug in core, I've managed to reproduce it on my dev env. I'll prepare a fix and let you know.
  18. I very much agree with @the.rampage.rado From a security point of view, it's much better to install thirty bees into a standalone subdomain. I have seen thirty bees sites that were infected because of a wordpress installation. Using subdomains has some additional benefits as well. For example, fewer cookies will be sent -- thirty bees will not receive wordpress cookies. Or a misconfiguration in .htaccess within the parent (wordpress) directory will not affect the thirty bees web. And I'm sure there are more. Install it in a standalone, isolated subdomain.
  19. As Smile wrote -- adding a new field to the db and implementing basic CRUD operations on top of it is quite simple. Anyone who really needs it can implement this functionality. However, adding this properly to the core is more complex. We have to consider other aspects as well, for example:

      - support this field in CSV import
      - expose this field in the Webservice, with both read and write access
      - impact on themes -- should this be exposed to the theme? Do we need to update community-managed themes?
      - since this is a classification field, maybe we should have a new table with all possible values instead of free text? If that's the case, we need to populate this table during install AND during store update, and make sure the table values are up-to-date (which means future work is needed)
      - and who knows what else

      It would take a few hours, maybe even days, to do that. It was not worth the effort at the time - there was no real demand for this field, and anyone who really needs it can create a limited implementation themselves.
  20. Yes, you should install the collectlogs module and let it collect errors for a few days. You have to fix all deprecation warnings - the core code should be ok, but you will get some errors from modules and the theme. After all warnings are fixed, you can safely update to a higher version of PHP. Always update one PHP level at a time - e.g. from 7.4 to 8.0. When you later want to update to 8.1, you will have to repeat this process again.
  21. You have to set permissions for this webservice key in Advanced Parameters > Webservice
  22. Your hosting provider suspended your account for some reason. Maybe you didn't pay, or maybe your site was sending spam emails... Contact your hosting provider.
  23. When I tested it yesterday it indeed displayed the robots.txt error. I assume that google used, at that time, some old (cached) version of robots.txt that blocked the access.
  24. My guess is that your server is blocking requests from google servers (maybe you have a firewall or something similar that blocks them). This is not an application issue. You can verify this by checking your server access log.