Optimized data management is at the core of every successful e-commerce operation. For fabric users managing extensive product catalogs, finely tuned data ingestion is paramount.
Following fabric’s best practices results in faster processing, better resource management, and more accurate imports.
This topic covers file size restrictions, the types of import actions, how to reconcile errors, and, most importantly, the recommended method of data ingestion: delta updates.
Before you upload your first file, it’s important to understand file size restrictions and how fabric handles files that exceed those restrictions.
A delta update transmits only the data fields that changed since the last upload. This is in contrast to traditional “full feed” updates, which resend the entire dataset. By receiving only the changed fields, fabric can process updates without reprocessing unchanged data.
Delta updates are the preferred method for all uploads.
| | Full Feed Data Updates | Delta Data Updates |
| --- | --- | --- |
| Resource Usage | Requires more resources | Requires fewer resources |
| Processing Time | Longer processing time | Shorter processing time |
| Data Transmission | Transmits entire dataset | Transmits only modified data fields |
| Network Bandwidth | Consumes more network bandwidth | Requires less network bandwidth |
| Storage | Requires more storage space | Requires less storage space |
| Error Handling | Prone to errors during full data transmission | Less prone to errors due to focused updates |
| Scalability | Less scalable for large datasets | More scalable, especially for large datasets |
| Data Accuracy | Potential for data redundancy and inconsistency | Enhances data accuracy by focusing on changes |
| Operational Efficiency | Lower operational efficiency due to higher resource consumption | Higher operational efficiency due to optimized resource usage |
| Incremental Updates | Updates entire dataset each time | Updates only modified data fields incrementally |
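The delta approach compared above can be sketched in a few lines of Python: diff the current catalog export against the previous snapshot and keep only the rows that are new or changed. This is an illustrative sketch, not fabric's implementation; the row structure and the `sku` key column are assumptions for the example.

```python
def build_delta(previous_rows, current_rows, key="sku"):
    """Return only the rows that are new or changed since the last upload.

    Rows are dicts (e.g. from csv.DictReader); `key` is the unique
    identifier column. "sku" is a hypothetical field name used here
    for illustration.
    """
    previous = {row[key]: row for row in previous_rows}
    # A row belongs in the delta if its key is new or any field differs.
    return [row for row in current_rows if previous.get(row[key]) != row]

# Example: one price changed, one item is new, one is unchanged.
old = [{"sku": "A1", "price": "10"}, {"sku": "B2", "price": "15"}]
new = [{"sku": "A1", "price": "12"}, {"sku": "B2", "price": "15"},
       {"sku": "C3", "price": "20"}]
delta = build_delta(old, new)
# delta contains only the A1 row (price changed) and the C3 row (new item)
```

Uploading `delta` instead of the full `new` snapshot is what produces the smaller files, shorter processing times, and lower bandwidth shown in the table above.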
You can import data into fabric using the following methods:
Whichever import method you choose, uploading smaller files and using the delta update method results in quicker processing, better resource management, and a higher degree of accuracy.
Before initiating an upload, make sure your dataset is accurate and compatible with fabric’s formatting. To avoid errors, review the file to identify any changes since the last upload and confirm that the data structure and format are correct. See the following pages for formatting guidelines:
The actions you use when importing items, bundles, categories, and collections tell fabric how you are modifying your data. The following actions are available:
If errors occur during processing, download the error file and review each entry to identify the problem. Correct the errors in the CSV file, then validate the corrected file before re-importing it.
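The validate-before-re-importing step can be automated with a small pre-flight check that collects every row-level problem in one pass, so you fix the whole file once rather than discovering errors upload by upload. A minimal sketch follows; the required column names (`sku`, `title`, `price`) and the checks themselves are assumptions for illustration, not fabric's actual validation rules.

```python
import csv
import io

# Hypothetical required fields -- substitute your catalog's actual schema.
REQUIRED_COLUMNS = {"sku", "title", "price"}

def validate_rows(csv_text):
    """Return a list of problems found in a CSV export, one message per
    issue, so every error can be corrected before re-importing."""
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
    if missing:
        return [f"missing columns: {sorted(missing)}"]
    errors = []
    for line_no, row in enumerate(reader, start=2):  # line 1 is the header
        if not row["sku"]:
            errors.append(f"line {line_no}: empty sku")
        try:
            float(row["price"])
        except ValueError:
            errors.append(f"line {line_no}: price is not numeric")
    return errors

# A clean file returns no errors; a broken one reports each problem row.
clean = "sku,title,price\nA1,Shirt,10.00\n"
broken = "sku,title,price\nA1,Shirt,ten\n,Hat,5\n"
```

Running `validate_rows(broken)` flags the non-numeric price on line 2 and the empty `sku` on line 3, mirroring the review-correct-validate loop described above.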