thirty bees forum

Updating products from Filemaker ?


ssimard


We have a Filemaker database and would possibly like to update our online store directly from it.

We use it for the internal management of our projects (author rights and the like, since we are a publishing company). The team would like to reduce data entry, so synchronizing the website from that database would be great.

I know ODBC can be used to access the thirty bees tables, but what about image creation, product removal, etc.? I fear some mechanisms won't be triggered if I do it that way (instead of importing through the TB .csv import module).

Is there a ready-made module or anything else that could make my life simpler? Is there an API of some sort?

Thanks.


We have a very similar situation with a Windows-based POS system that needs to be synced into TB.

I wrote a program that runs as a service on the same machine as the Windows POS system, periodically reads the POS database, and sends the data over to TB using the webservice API. It handles product changes, additions and removals, stock changes, images, and so on.

The webservice API is quite rich in functionality, but it takes some programming skills and effort to use it.
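
For the curious, the whole service boils down to a polling loop along these lines. This is a bare sketch in Python rather than my actual code; the POS query and the mapping to webservice calls are placeholders you would fill in for your own data.

    import time
    import requests

    API = "https://www.yourdomain.com/api"
    AUTH = ("putyourwebserviceapikeyhere", "")  # webservice key as username, blank password

    def read_pos_changes():
        # Placeholder: query the POS database for rows changed since the last run.
        return []

    def push_change(product_xml):
        # Placeholder mapping: POST /api/products for new items; edits and
        # removals would use PUT and DELETE on /api/products/{id} instead.
        requests.post(f"{API}/products", data=product_xml, auth=AUTH)

    while True:
        for change in read_pos_changes():
            push_change(change)
        time.sleep(300)  # poll the POS database every five minutes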


We also use a high degree of automation in our business. We use the webservice for some things and for others we write directly to the database itself over a secure connection.

The webservice is ideal for creating the listings when a site is first launched, as you can upload photos and be sure that the site and each item will be properly set up.

On the other hand, if you need to update prices for 100,000 different products (for example), it will take a long time over the webservice and require many separate calls. However, if you upload a CSV file with new prices, read it into a temporary table using MySQL commands, and then use SQL joins to update the data, you can update many thousands of items in just a few seconds.
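
As a rough sketch of the idea (Python with pymysql here just for illustration; the staging table layout and matching on the product reference are assumptions, and your table prefix may differ from tb_):

    import csv
    import pymysql

    # New prices exported from the back office / POS side as reference,price rows.
    with open("prices.csv", newline="") as f:
        rows = [(r["reference"], r["price"]) for r in csv.DictReader(f)]

    conn = pymysql.connect(host="localhost", user="tbuser",
                           password="secret", database="thirtybees")
    with conn.cursor() as cur:
        # Staging table; the PRIMARY KEY gives the join an index to work with.
        cur.execute("""CREATE TEMPORARY TABLE price_import (
                           reference VARCHAR(64) PRIMARY KEY,
                           price DECIMAL(20, 6) NOT NULL)""")
        cur.executemany(
            "INSERT INTO price_import (reference, price) VALUES (%s, %s)", rows)
        # One joined UPDATE touches every matching product in a single statement.
        cur.execute("""UPDATE tb_product p
                       JOIN price_import i ON i.reference = p.reference
                       SET p.price = i.price""")
    conn.commit()
    conn.close()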

Of course, writing directly to the database tables is not a recommended or supported approach. Newer versions of 30bz will probably change the database structure somewhat, and any code that writes directly to the tables may need to be rewritten to stay compatible with new versions of the platform. That is the risk, but until there is bulk update functionality in the webservice (which I hope happens!), direct table updates are the only viable way to update vast amounts of data in a reasonable amount of time.


  • 3 months later...

Would anyone who has written code to use the API/webservice be willing to share? When googling for sample scripts I'm not getting many usable results. For now I am using the CSV import, but in the near future I would love to add products, change stock/prices and so on with the webservice. On the TB site there is nothing but an overview of the classes and controllers.

Marc, the Netherlands


The API is the same as the PS 1.6 API so any example code or libraries for PrestaShop 1.6 should work fine.

Personally I do my automation in .NET and use the PrestaSharp library that you can find freely on GitHub.

If you have a lot of products, however, you will find that using the API for things like price & stock updates is quite slow. API updates are done one at a time, so you can imagine that with 10,000 or 100,000 products (or even 1,000 products!) individual updates will take a LOT of time. With this in mind I write these updates directly to the db tables. I described the process in a little more detail in my earlier message, just slightly further back on this page.
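
To make the one-at-a-time point concrete, here is roughly what a single stock update looks like over the webservice. I'm sketching it in Python with requests purely for illustration (PrestaSharp in .NET does the same GET, modify, PUT dance), and a production version would also need to deal with read-only fields.

    import requests
    import xml.etree.ElementTree as ET

    API = "https://www.yourdomain.com/api"
    AUTH = ("putyourwebserviceapikeyhere", "")  # key as username, blank password

    def set_quantity(stock_available_id, new_qty):
        url = f"{API}/stock_availables/{stock_available_id}"
        # Fetch the full resource, change one field, send the whole thing back.
        # Two HTTP round trips per item is why bulk updates are slow this way.
        tree = ET.fromstring(requests.get(url, auth=AUTH).content)
        tree.find(".//quantity").text = str(new_qty)
        requests.put(url, data=ET.tostring(tree), auth=AUTH)

    set_quantity(1, 25)  # one item at a time; 100,000 products means 200,000 requests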


The API is just sending & receiving data as XML. You can see the required XML formats directly on your own server, and they are talked about in the PS 1.6 documentation as well. For example, for products:

https://www.yourdomain.com/api/products?wskey=putyourwebserviceapikeyhere&schema=synopsis

Obviously make sure you have SSL working first so you aren't passing your webservice key in plaintext.

You can see a full list of the different API calls with this:

https://www.yourdomain.com/api

Substitute call names into the previous URL (replace "products" with the new call name) and you will get the XML file layout information.
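
The same lookup can be scripted, for example with Python's requests (illustration only):

    import requests

    # Ask for the "synopsis" schema of a resource; swap "products" for any other
    # name from the /api list to see that resource's XML layout.
    schema = requests.get(
        "https://www.yourdomain.com/api/products",
        params={"wskey": "putyourwebserviceapikeyhere", "schema": "synopsis"},
    )
    print(schema.text)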

The PS webservice is not perfect and there are some problems, in my experience, with NULL values. The webservice does not handle them properly and some calls require them. This is hit and miss and you will have to look at the sample products/customers/addresses/shops/etc to see where a NULL is required but a 0 has been inserted. The easiest way to fix this will be by using triggers on the MySQL server that run whenever a new record is inserted into these tables. The trigger can update the new records and put in the NULLs. It's a PITA but PS never fixed this, even though I reported it in great detail to them as a bug.
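
A minimal sketch of such a trigger follows. The table and column names are placeholders, not the real spot where the webservice writes a 0, so check your own data first; pymysql is just one way to run the statement.

    import pymysql

    # Hypothetical example: turn the 0 the webservice inserted back into NULL.
    # tb_some_table / some_nullable_column stand in for the real table and column.
    TRIGGER_SQL = """
    CREATE TRIGGER fix_webservice_zero
    BEFORE INSERT ON tb_some_table
    FOR EACH ROW
    SET NEW.some_nullable_column = NULLIF(NEW.some_nullable_column, 0)
    """

    conn = pymysql.connect(host="localhost", user="tbuser",
                           password="secret", database="thirtybees")
    with conn.cursor() as cur:
        cur.execute(TRIGGER_SQL)
    conn.commit()
    conn.close()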

There are also issues like this one where you either have to set the carrier/group data manually through the back office or just write directly to the db table. Again, a trigger can be used to make sure this works but for some unknown reason PS declined to fix this problem. I don't know what the f#ck the point of automation is when it's not actually automated, but PS is PS. Maybe it's a French thing?

Regarding my SQL commands for doing mass updates, I'll tell you the basic process I followed:

  1. Create a CSV file with all the price & stock update information. This will be imported into a MySQL table so make sure to have the necessary keys in the file, and the format correct for your table and for MySQL.

  2. Send the file to the server using SFTP.

  3. Using an SSH tunnel to the remote MySQL server, issue SQL commands to create a temporary table to import the data into.

  4. Create an SSH session that is able to issue command line commands on the remote server.

  5. Using the SSH session from #4, issue MySQL command line commands to import the CSV file into the temporary table. I wasn't able to do this with plain SQL as the MySQL server didn't have permission to read the file; using the MySQL command line client circumvented this problem. The import, even for thousands of lines of data, will be nearly instant.

  6. Using SQL over SSH again, create the temporary table indexes needed for the updates. Don't skip this or your update speed will be terrible.

  7. Now that the indexes are in place use SQL UPDATE commands with table JOINs to update the target data from the data in the temporary table. As long as your temporary table indexes are in place this will be nearly instant.

This somewhat convoluted method will take some effort to implement, but it allows you to update price & stock data for tens of thousands of items in a matter of a few seconds. I am setting up a new development server right now, and when I get a few thousand products onto it I will do some tests and post some results. It's fast though, very fast. I needed a solution that would allow me to update 100,000+ products very quickly, and this is what I came up with.
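
For reference, here is a compressed sketch of steps 2 through 7 using Python with paramiko. Hosts, credentials, paths and the exact SQL are placeholders, --local-infile must be allowed on both client and server, and I fold steps 3 to 7 into a single mysql -e call so the temporary table, the import and the joined UPDATE all share one session (my own setup uses a separate SSH tunnel instead).

    import paramiko

    SQL = (
        "CREATE TEMPORARY TABLE price_import ("
        "reference VARCHAR(64) PRIMARY KEY, price DECIMAL(20,6)); "
        "LOAD DATA LOCAL INFILE '/tmp/prices.csv' INTO TABLE price_import "
        "FIELDS TERMINATED BY ',' IGNORE 1 LINES; "
        "UPDATE tb_product p JOIN price_import i ON i.reference = p.reference "
        "SET p.price = i.price;"
    )

    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect("shop.example.com", username="deploy", key_filename="/home/me/.ssh/id_rsa")

    # Step 2: push the CSV to the server over SFTP.
    sftp = ssh.open_sftp()
    sftp.put("prices.csv", "/tmp/prices.csv")
    sftp.close()

    # Running the mysql client on the remote host means LOCAL INFILE reads the
    # file there, which is what sidesteps the server's FILE permission problem.
    cmd = f'mysql --local-infile=1 -u tbuser -psecret thirtybees -e "{SQL}"'
    stdin, stdout, stderr = ssh.exec_command(cmd)
    print(stdout.read().decode(), stderr.read().decode())
    ssh.close()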

As far as actually creating items on the remote server is concerned, I use the API to do this one item at a time. There is too much going on on the remote server during product creation to do it effectively with the system above. In particular, since I need the 30bz database ID for each product (to enable the high speed updating), I didn't want to screw around trying to find a way to create products outside the API. Doing it over the API also lets me automate the process of uploading photos. As I don't have to create the same products over and over again (unlike with the stock & price updates), I don't mind if this process takes time.
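
A bare-bones sketch of that per-product flow (Python with requests; the XML is trimmed to a few illustrative fields, so check the synopsis schema for everything your shop actually requires):

    import requests
    import xml.etree.ElementTree as ET

    API = "https://www.yourdomain.com/api"
    AUTH = ("putyourwebserviceapikeyhere", "")

    # Trimmed illustrative payload; real products need more fields (link_rewrite,
    # categories and so on), which the synopsis schema will list.
    product_xml = """<?xml version="1.0" encoding="UTF-8"?>
    <prestashop>
      <product>
        <reference>DEMO-001</reference>
        <price>19.99</price>
        <active>1</active>
        <name><language id="1">Demo product</language></name>
      </product>
    </prestashop>"""

    # POST creates the product; the response carries the new database id, which is
    # the same id needed later for the high speed direct-table updates.
    resp = requests.post(f"{API}/products", data=product_xml, auth=AUTH)
    new_id = ET.fromstring(resp.content).find(".//id").text

    # Photos are a separate resource: a multipart upload against the new id.
    with open("demo-001.jpg", "rb") as img:
        requests.post(f"{API}/images/products/{new_id}", files={"image": img}, auth=AUTH)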

Hope this helps!


I should add that if you are asked for authentication by the API, the username is your webservice key and the password is blank. This should only happen with the https://www.yourdomain.com/api URL, as the other format already has the key included in the URL.
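
In script form that just means HTTP basic auth with an empty password, for example with Python's requests:

    import requests

    # Webservice key as the username, empty string as the password.
    resources = requests.get("https://www.yourdomain.com/api",
                             auth=("putyourwebserviceapikeyhere", ""))
    print(resources.text)  # XML listing of every resource the key can access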


@mdekker said in Updating products from Filemaker ?:

Did I read a bug report in your long post?

Yes, the PS API doesn't support sending NULL values, as far as I am able to figure out.

In many cases this isn't a problem because if you send nothing then the default value is used, and the default value is NULL. However, in at least one case (detailed in the PS bug report I linked to) a value must be sent but NULL cannot be sent, because the API doesn't support sending NULL. Well... you can send NULL, but it will either cause the call to fail or be interpreted as a zero. This results in having to manually update the db table to set those zero values to NULL, either with a MySQL trigger or by writing directly to the table after using the API.

There is also the second issue I linked to, where the API does not seem to provide a way to set Carrier Group Access. As a result, any carrier created by the API cannot be used until this is set manually in the back office, by writing to the db table, or with a trigger.

The PS API works fairly well but isn't quite complete and isn't very well documented. It also doesn't have a way to do bulk updates. Being able to send a file through the API and have a high speed bulk update happen would be a great addition.

I'm hesitant to request too many changes to the 30bz API though as so far I have been able to reuse all the code I developed for PS without making any changes to it...


  • 3 weeks later...

@lesley said in Updating products from Filemaker ?:

I have a client that does this, he puts directly into the database and does not use the API from my understanding.

This is what I do for price & stock updates, and is also how I create our somewhat complex 130 country carrier/delivery structure.

However, for adding products to the database I just use the API. Products only need to be added once, so even if there are 100,000 of them, letting the API do its thing (including all the photos) isn't a huge issue.


@dynambee for products I am now using this script: https://coderwall.com/p/fwoaqq/prestashop-background-cron-csv-products-import-modified. I have adapted it a bit for virtual products. It is a handy way to import large numbers of products; I'm just hoping the backend (import) will not change with updates. For now I need to concentrate more on module development for ESD delivery and license codes. Price and stock updates are a breeze with your explanation.

Regards, Marc

