Everything posted by dynambee
-
@mdekker said in I made the switch to Thirty Bees!: "Nice, thanks for all the posts @SLiCK_303. Didn't know all these improvements were possible. Now enjoying them in my own store :)"

Improvements, you say? Coming in 1.0.4? :)
-
Loads quickly even from over here in Japan!
-
I'll definitely be using this! I doubt most people give a real DOB and it's just something that annoys a lot of customers.
-
@mdekker said in Updating products from Filemaker ?: "Did I read a bug report in your long post?"

Yes. The PS API doesn't support sending NULL values, as far as I am able to figure out. In many cases this isn't a problem: if you send nothing, the default value is used, and the default value is NULL. However, in at least one case (detailed in the PS bug report I linked to) you are required to send a value, but NULL cannot be sent because the API doesn't support it. Well... you can send NULL, but it will either cause the call to fail or be interpreted as a zero. This means having to manually update the db table afterwards to set those zero values back to NULL, either with a MySQL trigger or by writing to the table directly after using the API (a rough trigger sketch is at the end of this post).

There is also the second issue I linked to, where the API does not seem to provide a way to set Carrier Group Access. As a result, any carrier created by the API cannot be used until this is set manually -- or you can do it by writing to the db table, or with a trigger.

The PS API works fairly well but isn't quite complete and isn't very well documented. It also doesn't have a way to do bulk updates. Being able to send a file through the API and have a high speed bulk update happen would be a great addition. I'm hesitant to request too many changes to the 30bz API though, as so far I have been able to reuse all the code I developed for PS without making any changes to it...
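To illustrate the trigger workaround, here is a minimal sketch. The table and column names are placeholders, since the affected field depends on which API call you are dealing with; the idea is simply to turn the zero the webservice wrote into the NULL it should have been.

```sql
-- Hypothetical sketch only: ps_some_table / some_column are placeholders
-- for whichever table and field your particular API call leaves at 0.
-- NULLIF(x, 0) returns NULL when x = 0, otherwise x unchanged.
CREATE TRIGGER fix_api_zero_to_null
BEFORE INSERT ON ps_some_table
FOR EACH ROW
SET NEW.some_column = NULLIF(NEW.some_column, 0);

-- If the webservice also updates existing rows, a matching BEFORE UPDATE
-- trigger on the same column is probably needed as well.
```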
-
I should add that if the API asks you for authentication, the username is your webservice key and the password is blank. This should only happen with the https://www.yourdomain.com/api URL, as the other format already includes the key in the URL.
-
The API is just sending & receiving data as XML. You can see the required XML formats directly on your own server, and they are covered in the PS 1.6 documentation as well. For example, for products:

https://www.yourdomain.com/api/products?wskey=putyourwebserviceapikeyhere&schema=synopsis

Obviously make sure you have SSL working first so you aren't passing your webservice key in plaintext. You can see a full list of the different API calls with this:

https://www.yourdomain.com/api

Substitute call names into the previous URL (replace "products" with the new call name) and you will get the XML layout information for that call.

The PS webservice is not perfect, and in my experience there are problems with NULL values. The webservice does not handle them properly and some calls require them. This is hit and miss, and you will have to look at the sample products/customers/addresses/shops/etc. to see where a NULL is required but a 0 has been inserted. The easiest fix is a trigger on the MySQL server that runs whenever a new record is inserted into those tables; the trigger can update the new records and put in the NULLs. It's a PITA, but PS never fixed this even though I reported it to them in great detail as a bug. There are also issues like this one where you either have to set the carrier/group data manually through the back office or just write directly to the db table. Again, a trigger can be used to make sure this works, but for some unknown reason PS declined to fix this problem. I don't know what the f#ck the point of automation is when it's not actually automated, but PS is PS. Maybe it's a French thing?

Regarding my SQL commands for doing mass updates, this is the basic process I followed (a rough SQL sketch is at the end of this post):

1. Create a CSV file with all the price & stock update information. This will be imported into a MySQL table, so make sure the file has the necessary keys and the format is correct for your table and for MySQL.
2. Send the file to the server using SFTP.
3. Using an SSH tunnel to the remote MySQL server, issue SQL commands to create a temporary table to import the data into.
4. Create an SSH session that is able to issue command line commands on the remote server.
5. Using the SSH session from step 4, issue MySQL command line commands to import the CSV file into the temporary table. I wasn't able to do this with plain SQL because the MySQL server didn't have permission to read the file; using the MySQL command line circumvented this problem. The import, even for thousands of lines of data, is nearly instant.
6. Using SQL over SSH again, create the temporary table indexes needed for the updates. Don't skip this or your update speed will be terrible.
7. Now that the indexes are in place, use SQL UPDATE commands with table JOINs to update the target data from the data in the temporary table. As long as the indexes are in place this is also nearly instant.

This somewhat convoluted method takes some effort to implement, but it allows you to update price & stock data for tens of thousands of items in a matter of a few seconds. I am setting up a new development server right now, and when I get a few thousand products onto it I will do some tests and post some results. It's fast though, very fast. I needed a solution that would allow me to update 100,000+ products very quickly and this is what I came up with.

As far as actually creating items on the remote server is concerned, I use the API to do this one item at a time. There is too much going on with the remote server during product creation to do it effectively using the system above. Especially since I need to get the 30bz remote database table ID for each product (to enable the high speed updating), I didn't want to screw around with trying to find a way to create products outside the API. Doing it over the API also lets me automate the process of uploading photos. And since I don't have to create the same products over and over again (unlike the stock & price updates), I don't mind if this process takes time. Hope this helps!
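To make the SQL side of the steps above concrete, here is a rough sketch. The staging table name, its columns, the CSV layout, and the ps_product column names are all placeholders/assumptions from memory, so check them against your own schema before using anything like this.

```sql
-- Step 3: staging table for the CSV (placeholder layout; a normal table is
-- used instead of a TEMPORARY one so it stays visible to both the SSH
-- tunnel session and the separate command line session).
CREATE TABLE price_stock_import (
    reference VARCHAR(64)   NOT NULL,
    price     DECIMAL(20,6) NOT NULL,
    quantity  INT           NOT NULL
);

-- Step 5: run from the mysql command line client on the server. LOCAL makes
-- the client read the file, so the server itself needs no FILE privilege
-- (local_infile must be enabled). Assumes a one-line CSV header.
LOAD DATA LOCAL INFILE '/path/to/updates.csv'
INTO TABLE price_stock_import
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;

-- Step 6: index the join key first or the update will crawl.
ALTER TABLE price_stock_import ADD INDEX idx_reference (reference);

-- Step 7: update the live data with a JOIN against the staging table.
-- (Stock can be updated the same way against the stock table.)
UPDATE ps_product p
JOIN price_stock_import i ON i.reference = p.reference
SET p.price = i.price;

-- Clean up when done.
DROP TABLE price_stock_import;
```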
-
thirty bees is on Patreon, become a patron today!
dynambee replied to lesley's topic in Announcements about thirty bees
Oh wow, I didn't realize that. I thought it was the same sort of business model. The yellow one seems to be out of stock anyway. :( If I happen across any other companies that offer Cafe Press style shipping but with more color variety I'll be sure to let you know. -
thirty bees is on Patreon, become a patron today!
dynambee replied to lesley's topic in Announcements about thirty bees
I like this one, not that it's my decision. -
thirty bees is on Patreon, become a patron today!
dynambee replied to lesley's topic in Announcements about thirty bees
Vistaprint has some yellow mugs. :) -
thirty bees is on Patreon, become a patron today!
dynambee replied to lesley's topic in Announcements about thirty bees
Along the lines of ways to support Thirty Bees, how about some goods to buy? I'd buy one of those yellow mugs with the logo on it, like the one shown in the 1.0.2 default store! There are services like Vistaprint and CafePress that take care of everything, so no inventory is needed, and they greatly simplify the design process. Of course they take a cut, but even the Guns'n'Roses t-shirt & mug I ordered earlier this year came from such a company. (Went to their concert in Osaka but all the t-shirts were small or medium... Daughter got to meet Slash though!) -
I've never had a problem getting SSL running with 30bz. What host are you using? What SSL cert? Which browser? You should be able to click on the lock and get a report about which items aren't secure. You probably have some URLs loading images that aren't using https for some reason, or you're loading images from a different site or something odd like that.
-
The API is the same as the PS 1.6 API, so any example code or libraries for PrestaShop 1.6 should work fine. Personally I do my automation in .NET and use the PrestaSharp library that you can find freely on GitHub. If you have a lot of products, however, you will find that using the API for things like price & stock updates is quite slow. API updates are done one at a time, so you can imagine that if you have 10,000 or 100,000 products (or even 1,000 products!) doing individual updates will take a LOT of time. With this in mind I write these updates directly to the db tables. I described the process in a little more detail in this message, just slightly further back on this page.
-
thirty bees is on Patreon, become a patron today!
dynambee replied to lesley's topic in Announcements about thirty bees
Was hoping you guys would set up a Patreon page! Contributed and expect to increase the amount in the future. -
My opinion is that users should be in control of their content and allowed to edit and delete as they see fit. However, I don't run TB, so my opinion doesn't really matter. That said, if TB is going to block deleting (which has already been done), then something has to be done about editing. It's purely symbolic to remove the delete function when users can effectively delete their posts anyway.
-
I agree that allowing editing has benefits. I like Reddit's system, where users can always edit or delete posts, but I realize it's not always ideal. However, if users can always edit their posts there is no point in blocking deleting. As I demo'd above, a message can be effectively deleted by editing it and removing the content.
-
Perhaps edits should be allowed for up to 30 minutes after posting, and then the posts should be locked. That gives people enough time to fix typos and other errors but makes sure the post record is properly kept over the long term.
-
I don't see a lot of point to removing the option to delete posts if people can still edit their posts. All a person has to do is edit their post and remove all the text and they have functionally deleted their post.
-
Holy shit. This is amazing. I will test next week. Thank you very much to @mdekker and @lesley for making this happen!
-
@daokakao said in Let's talk about Search!: "One small question: has anybody heard anything about the Sphinx full-text search engine for MySQL? It already has modules for connecting to PS 1.6. http://sphinxsearch.com/ As its devs say, it is very fast and has reasonable memory consumption."

There are a few drawbacks to Sphinx and quite a few extra benefits to Elasticsearch. I covered a bit of this a few months ago in a different thread. Overall, ES is currently the best open source search engine, and it is improving the fastest as well.
-
@mdekker: Welcome back and thank you very much for the update! Glad you took some time off; don't work yourself into burnout.
-
I'm definitely not expecting a finished module or even a beta, just wondering how development is progressing. I think an occasional status update (say, once a month, give or take) isn't too unreasonable a request.
-
Disable Payment modules "Carrier restrictions"
dynambee replied to Gontesque's question in Module help
Depending on how the module works, you may be able to use a MySQL trigger to automatically enable all payment modules for these temporary carriers (a hypothetical sketch is below). You could try talking to the module creator about this. What module is it, and doesn't PS suffer from the same problem?
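Purely as a sketch of that trigger idea, something along these lines could work. The table and column names (ps_carrier, ps_module_carrier, id_reference, id_shop) are assumptions based on my memory of the PS 1.6 layout, and id_shop is hard-coded to 1 for a single-shop install, so verify everything against your own database first.

```sql
-- Hypothetical: when a new carrier row is created, grant every installed
-- module access to it. This grants all modules, not just payment modules;
-- restricting it further would need a join against the hook tables.
CREATE TRIGGER allow_modules_for_new_carrier
AFTER INSERT ON ps_carrier
FOR EACH ROW
INSERT INTO ps_module_carrier (id_module, id_shop, id_reference)
SELECT m.id_module, 1, NEW.id_reference
FROM ps_module m;
```
-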
I would certainly hope it will work with 1.0.x as well; that's the current long-term stable release! It would be quite disappointing if the ES module wasn't compatible. :(
-
I've been away from the forums for a bit, my apologies if this has already been covered and I haven't seen it. I'm wondering if there is any update on the Elasticsearch module development?
-
@lesley said in Get this Wholesale thing correct.: "What about Cost, Wholesale Price, Retail Price for the 3 pricing areas?"

I've been away for a bit and otherwise busy and unable to post here, so my reply is a bit late. In our business we often have different cost prices from different suppliers for the same product. We base our retail prices on a combination of supplier priority, supplier price, and supplier stock availability; it's built into our pricing calculation system.

I think 30bz should focus on being a top class system for selling online and make sure it can be easily integrated into the popular ERP systems that medium to larger businesses are likely to use. Having a strong API is certainly a critical part of that, but perhaps providing connectors for one or more popular ERP systems would be a good idea? It seems like this would be a better use of limited resources than attempting to reinvent the wheel by adding ERP features to 30bz itself. Just my 2 yen's worth.