Schema validation in ONIX 3.0 vs. 2.1

Getting ready for ONIX 3.0? You should take note of the changes to schema validation.

Anyone who produces ONIX files should be aware that ONIX 3.0 is designed to be validated against a schema. BookNet previously promoted schema validation for ONIX 2.1 partly to pave the way for this, but mostly because schema validation finds real data problems. Simply put, schema validation ensures your ONIX file is loadable and readable in other companies' systems: it's the minimum acceptable standard for trading data.
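
If you want to try this yourself before a file goes out the door, any general-purpose XML library can run the check. Here's a minimal sketch in Python using lxml, assuming you've downloaded the ONIX 3.0 XSD from EDItEUR and saved it locally; the file names and paths are placeholders, not anything your system will already have.

```python
# Minimal sketch: validate an ONIX 3.0 file against a locally saved EDItEUR XSD.
# The file paths are placeholders -- point them at your own files.
from lxml import etree

schema_doc = etree.parse("ONIX_BookProduct_3.0_reference.xsd")  # downloaded from EDItEUR
schema = etree.XMLSchema(schema_doc)

onix_doc = etree.parse("my_onix_feed.xml")

if schema.validate(onix_doc):
    print("File is schema-valid.")
else:
    # error_log lists each violation with a line number, which is usually
    # enough to trace the problem back to a record in your database.
    for error in schema.error_log:
        print(f"Line {error.line}: {error.message}")
```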

You might not know that the ONIX 3.0 schema is already stricter than the one you're used to for 2.1. Try it for yourself: put a page return in SubjectHeadingText (b070) and watch your 3.0 file fail validation. That's useful because data aggregators and retailers shouldn't have to deal with page returns in the middle of keyword lists, so data senders should clean them up as part of the transition to 3.0. With the release of codelists issue 30, EDItEUR added newly updated "strict" versions of the schema. There are the standard schemas, plus two marked as "strict" that are set to replace them as the norm.
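
If your exports do contain stray page returns, you can normalize them before the file is sent. Below is a rough sketch, again in Python with lxml, that simply collapses embedded line breaks and runs of whitespace inside SubjectHeadingText; you could give the same treatment to other free-text elements. It assumes reference-tag ONIX 3.0, and the file names are illustrative.

```python
# Rough sketch: collapse line breaks and runs of whitespace inside
# SubjectHeadingText so keyword lists stay on a single line.
import re
from lxml import etree

ONIX_NS = "http://ns.editeur.org/onix/3.0/reference"  # reference-tag namespace

tree = etree.parse("my_onix_feed.xml")
for heading in tree.iter(f"{{{ONIX_NS}}}SubjectHeadingText"):
    if heading.text:
        heading.text = re.sub(r"\s+", " ", heading.text).strip()

tree.write("my_onix_feed_clean.xml", encoding="utf-8", xml_declaration=True)
```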

If you have an ONIX 3.0 file, you should test it against the strict versions of the schema. Start by reviewing this document to see whether what's in your database can pass these stricter checks. Some elements are being restricted to integers, positive numbers, or real numbers. Here's a quick tip: measurement values and price amounts of "0" won't pass. They have always been poor practice, and now they'll fall below the minimum acceptable standard.
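
Before the strict schemas become the norm, it's worth scanning your output for those zero values. Here's a quick sketch along the same lines as the examples above: it reports Measurement and PriceAmount elements whose value is 0 so you can fix them at the source. It assumes reference-tag ONIX 3.0; adjust the tag names if you send short-tag files.

```python
# Quick check: flag Measurement and PriceAmount values of 0, which won't
# pass the strict ONIX 3.0 schemas. Tags assume reference-tag ONIX.
from lxml import etree

ONIX_NS = {"o": "http://ns.editeur.org/onix/3.0/reference"}

tree = etree.parse("my_onix_feed.xml")
for tag in ("Measurement", "PriceAmount"):
    for elem in tree.xpath(f"//o:{tag}", namespaces=ONIX_NS):
        try:
            value = float(elem.text or "")
        except ValueError:
            continue  # non-numeric values will be caught by the schema itself
        if value == 0:
            print(f"Line {elem.sourceline}: {tag} = {elem.text!r}")
```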


This originally appeared on the BNC Blog at https://www.booknetcanada.ca/blog/2015/8/11/onix-30-schema-getting-stricter. Subscribe to the blog RSS at https://www.booknetcanada.ca/blog?format=rss.