We need to collect data from the Wildberries Statistics API on a daily basis.
We are looking for a contractor who has already worked with the Wildberries API.
We want to collect the following data from the Statistics API:
Warehouse stocks method: collect all available stock data once a day. The dateFrom field does not matter here; the service always returns only the latest data. Collect at 04:00 Moscow time: the data is updated three times a day, and at that hour we get an approximate end-of-day snapshot. Add a new Date column with the upload date to the retrieved data. The API can return an empty array or an error, so the script should send a repeat request if either occurs.
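A minimal sketch of this fetch-with-retry step. The endpoint URL, the auth scheme, and the retry count/delay are assumptions for illustration and should be checked against the current WB Statistics API documentation:

```python
import time
from datetime import datetime, timezone, timedelta

import requests

# Assumed endpoint and auth scheme -- verify against the current WB docs.
STOCKS_URL = "https://statistics-api.wildberries.ru/api/v1/supplier/stocks"
API_KEY = "..."  # supplier API token (placeholder)

MSK = timezone(timedelta(hours=3))  # Moscow time (UTC+3, no DST)

def fetch_stocks(max_retries=3, delay_sec=60):
    """Fetch current stocks; retry when the API errors or returns an empty array."""
    rows = []
    for _ in range(max_retries):
        try:
            resp = requests.get(
                STOCKS_URL,
                params={"dateFrom": "2019-01-01"},  # value is ignored: latest data only
                headers={"Authorization": API_KEY},
                timeout=60,
            )
            resp.raise_for_status()
            rows = resp.json() or []
        except (requests.RequestException, ValueError):
            rows = []
        if rows:               # non-empty answer -> success
            break
        time.sleep(delay_sec)  # wait, then send the repeat request
    upload_date = datetime.now(MSK).date().isoformat()
    for row in rows:
        row["Date"] = upload_date  # the extra Date column required by the spec
    return rows
```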
Orders method: collect all available data for the previous day at 10:00 Moscow time, again adding the extra Date column. The API can return an empty array or an error, so the script should send a repeat request if either occurs. Rewrite the results after a week: one week later, delete all rows for that day, re-query the API, and write the data again. This is necessary because the figures for recent days can contain errors. Perform this rewrite at 05:00; a sketch of the delete-and-reload step follows below.
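A sketch of the weekly rewrite under assumed names: a minimal schema `orders(date, payload)`, and a `fetch_orders(day)` helper analogous to the stocks fetch above (here imported from a hypothetical `wb_etl` module). Running delete and insert in one transaction means Power BI never sees a half-rewritten day:

```python
import json
from datetime import date, timedelta

import psycopg2
from psycopg2.extras import execute_values

from wb_etl import fetch_orders  # hypothetical helper re-querying the Orders method

def rewrite_day(conn, day):
    """Delete the stored rows for `day` and re-insert freshly fetched data."""
    rows = fetch_orders(day)
    with conn, conn.cursor() as cur:  # one transaction: commit on success, rollback on error
        cur.execute("DELETE FROM orders WHERE date = %s", (day,))
        execute_values(cur,
                       "INSERT INTO orders (date, payload) VALUES %s",
                       [(day, json.dumps(r)) for r in rows])

# At 05:00 Moscow time, rewrite the day that is now one week old:
conn = psycopg2.connect("dbname=wildberries user=wb_etl password=... host=...")  # placeholder DSN
rewrite_day(conn, date.today() - timedelta(days=7))
```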
Sales method: collect all available data for the previous day at 10:30 Moscow time, staggered by half an hour from the Orders request. The API can return an empty array or an error, so the script should send a repeat request if either occurs. As with orders, rewrite the results after a week: one week later at 05:00, delete all rows for that day, re-query the API, and write the data again, because the figures for recent days can contain errors.
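One way to pin all of the jobs above to Moscow time regardless of the server's own timezone is APScheduler's cron trigger. This is only a sketch; the job functions are the hypothetical helpers referenced in the other snippets:

```python
from apscheduler.schedulers.blocking import BlockingScheduler

# Hypothetical module holding the job functions sketched elsewhere in this spec.
from wb_etl import (fetch_stocks, fetch_orders, fetch_sales,
                    rewrite_week_old_data, fetch_report)

sched = BlockingScheduler(timezone="Europe/Moscow")

# Daily pulls, staggered so the orders and sales requests do not hit the API together
sched.add_job(fetch_stocks, "cron", hour=4,  minute=0)   # stocks: ~end-of-day snapshot
sched.add_job(fetch_orders, "cron", hour=10, minute=0)   # orders for the previous day
sched.add_job(fetch_sales,  "cron", hour=10, minute=30)  # sales, offset by 30 minutes

# Daily at 05:00: re-pull and rewrite the day that is now one week old
sched.add_job(rewrite_week_old_data, "cron", hour=5, minute=0)

# Realization report every Wednesday at 08:00
sched.add_job(fetch_report, "cron", day_of_week="wed", hour=8, minute=0)

sched.start()
```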
Realization (sales) report method: collect all available data with a request on Wednesdays at 08:00 Moscow time. Note that the reports can be very long and the WB API returns no more than 100,000 rows per request, so check the row count of the first response; if it contains 100,000 rows, send another request passing the rrdid of the last row, and repeat until a shorter page arrives.
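A sketch of that pagination loop. The endpoint URL, parameter names, and the `rrd_id` response field are assumptions based on the description above and should be verified against the current WB documentation:

```python
import requests

# Assumed endpoint and parameter names -- verify against the current WB docs.
REPORT_URL = "https://statistics-api.wildberries.ru/api/v1/supplier/reportDetailByPeriod"
PAGE_LIMIT = 100_000  # the API returns at most 100,000 rows per request

def fetch_report(date_from, date_to, api_key):
    """Page through the realization report using the rrdid of the last returned row."""
    all_rows, rrdid = [], 0
    while True:
        resp = requests.get(
            REPORT_URL,
            params={"dateFrom": date_from, "dateTo": date_to,
                    "limit": PAGE_LIMIT, "rrdid": rrdid},
            headers={"Authorization": api_key},
            timeout=120,
        )
        resp.raise_for_status()
        rows = resp.json() or []
        all_rows.extend(rows)
        if len(rows) < PAGE_LIMIT:    # short page -> no more data
            return all_rows
        rrdid = rows[-1]["rrd_id"]    # continue after the last row received
```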
Since a Date column is added to the stocks, orders, and sales tables, make sure the date format is identical everywhere.
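One simple way to guarantee this is to route every table's Date value through a single helper, for example producing ISO 8601 (`YYYY-MM-DD`); a tiny sketch:

```python
from datetime import datetime, timezone, timedelta

MSK = timezone(timedelta(hours=3))  # Moscow time

def upload_date():
    """Single source of the Date column value: ISO 8601, e.g. '2022-07-19'."""
    return datetime.now(MSK).date().isoformat()
```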
The database must be deployed on PostgreSQL and hosted on a server together with the script that makes the requests on this schedule. If a script is needed, it must be written in Python. It must be possible to connect to the database from an external environment, in particular from Power BI.
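The same connection parameters the script uses (host, port, database, user) are what Power BI's built-in PostgreSQL connector will need. A sketch of the script side, with placeholder credentials:

```python
import psycopg2

# Placeholder credentials. For external access the server's postgresql.conf
# (listen_addresses) and pg_hba.conf must allow connections from outside,
# e.g. from the machine running Power BI.
conn = psycopg2.connect(
    host="db.example.com",   # public host/IP of the server (placeholder)
    port=5432,
    dbname="wildberries",
    user="powerbi_reader",   # a read-only role for reporting is a sensible default
    password="...",
)
```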
Short documentation and comments in the code are also required, so that other specialists can quickly understand the architecture of the solution.
In the next stage, we will need to develop a Telegram bot for uploading data, with the option of "replacing" it. In a third stage, reports will need to be built on top of this data. These later stages will be discussed after the first stage is complete, if you are interested. Eventually, all of these steps will need to be repeated for Ozon.
19.07.2022 20:10