In recent years we have seen an explosion of public data from governments and public sector institutions, providing transparency to citizens and new opportunities to benefit society. This information is being used in many big data projects, from demographics for health care to machine learning algorithms for modelling behaviour. The scale and frequency of this data make analysis difficult, and a key challenge is finding the change that matters, so that differences can be processed efficiently and understood. DeltaXML is helping organisations across the globe with XML comparison and merge software tools, and now has a data-optimised version that is up to 99% faster: XML Data Compare.
During the current pandemic statistical analysis has been included in every news item, article, and conversation to tell the story of Covid. Organisations across the globe want transparent information about infection rates, cases, deaths and vaccinations from their governments and public institutions to inform their decision making. Everyone demands this information quickly and accurately, but it is constantly changing, so how do we make sense of change in such vast and rapidly changing data?
There are many software tools available for comparing datasets that can help analyse change in data, but few understand the data in the context of its structure, and this is where we have made performance gains. For our customers, analysing change and representing differences in a timely manner is a sophisticated problem that offers significant value. Our research and development has focused on two common scenarios: large-scale datasets, and providing a delta, or set of changes. By developing our algorithms and updating our programming methodology, we have been able to unlock significant gains in performance, which has the additional benefit of reducing your compute costs by over 90% in private cloud architectures using on-demand infrastructure.
For example, large daily data sets for infection rates, cases, deaths, and vaccinations are published with retrospective updates to historical information, so data analysts need to compare each new data set with their current model to determine what changed and update the relevant data. That processing can be time-consuming, resource-intensive, and prone to statistical or processing errors. Our software compares the current data set with the previous one using sophisticated context-aware data comparison and produces a considerably smaller file of changes, which is completely accurate and updates your data model in a fraction of the time.
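To illustrate the idea, here is a minimal sketch of this compare-and-delta pattern. It is not DeltaXML's algorithm or API; it simply shows, with Python's standard library and invented example data, how comparing a previous and current XML data set keyed by record can yield a much smaller set of changed records to apply to a model.

```python
# Minimal sketch of delta extraction from two XML datasets.
# Hypothetical example data; NOT DeltaXML's comparison engine.
import xml.etree.ElementTree as ET

PREVIOUS = """<cases>
  <record id="2021-01-01"><infections>100</infections><deaths>2</deaths></record>
  <record id="2021-01-02"><infections>120</infections><deaths>3</deaths></record>
</cases>"""

CURRENT = """<cases>
  <record id="2021-01-01"><infections>100</infections><deaths>2</deaths></record>
  <record id="2021-01-02"><infections>125</infections><deaths>3</deaths></record>
  <record id="2021-01-03"><infections>90</infections><deaths>1</deaths></record>
</cases>"""

def index_records(xml_text):
    """Map each record's id to a {field: value} dict."""
    root = ET.fromstring(xml_text)
    return {rec.get("id"): {child.tag: child.text for child in rec}
            for rec in root}

def delta(old_text, new_text):
    """Return only the records that were added or modified."""
    old, new = index_records(old_text), index_records(new_text)
    return {key: fields for key, fields in new.items()
            if old.get(key) != fields}

# Only the retrospectively revised and newly added days survive:
changes = delta(PREVIOUS, CURRENT)
```

Applying `changes` to an existing model touches two records instead of reprocessing the whole data set, which is the source of the time and compute savings described above.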
DeltaXML’s new XML Data Compare software can help your organisation compare or merge data as part of your application workflow, using our REST server in your own private infrastructure. The tools are highly scalable and easy to configure, whilst retaining the quality of comparison results your most demanding applications require. Our experts will guide you through the evaluation of our tools, providing webinars, remote installation, and even sample test files to get you up and running quickly. Why not reach out to DeltaXML today: make the change and we’ll show you the change.