
NUITEQ successfully deployed new NUITEQ Snowflake data architecture

NUITEQ successfully deployed the new architecture and backbone of the cloud-based version of its award-winning educational solution, NUITEQ Snowflake, to accommodate the growing demand for the company's web-based EdTech services.

This expansion enables more efficient monitoring, further improves stability, and boosts performance for the rapidly expanding NUITEQ Snowflake Community.

"Our uptime for the cloud-based version of NUITEQ Snowflake has gone up from 98% to 99.99%, and loading speeds have improved by 20-59%, so we are extremely excited to bring this new release to our users."


  Harry van der Veen 

  Chief Executive Officer
  NUITEQ

"This transformation of our system has been in the making for over a year and we're proud of the fantastic work our developers have delivered. We now use the same architecture as companies like Facebook, making our platform future proof for extreme scalability, reliability, and performance.", says NUITEQ's CEO and co-founder Harry van der Veen. 

To give a peek behind the curtain, NUITEQ's CTO Johan Larsson and Chief Software Architect Sharath Patali share their technical perspectives.

Be aware: geek talk ahead ❤️



  Johan Larsson 

  Chief Technical Officer
  NUITEQ

"After a large influx of users in 2019 and 2020, we started to notice challenges with our previous database engine. We were basically starting to run into the edge cases of CouchDB, where we were trying to perform task that required high volumes of views of the database and enormously complex object hierarchies in our documents. That did not scale like we wanted to, which meant we had to perform significant manual work to maintain reliability and performance was not where we wanted it to be, as we always try to raise the bar and challenge ourselves.

We immediately started looking for solutions to these challenges and realised that they would be best solved by a classic SQL database. The best-performing and most reliable option for us was PostgreSQL. Early on, we found TypeORM, which lets us define the database schema through an object-relational mapping in a way that worked best for us.
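As a rough illustration of what that looks like in practice, here is a minimal TypeORM entity sketch; the entity and column names are hypothetical, not Snowflake's actual schema:

```typescript
import { Entity, PrimaryGeneratedColumn, Column, Index } from "typeorm";

// A minimal sketch of defining the relational model with TypeORM decorators.
// Entity and column names are hypothetical, not Snowflake's actual schema.
@Entity()
export class LessonActivity {
  @PrimaryGeneratedColumn("uuid")
  id: string;

  @Index() // owner lookups are frequent, so index this column
  @Column()
  ownerId: string;

  @Column({ length: 200 })
  title: string;

  @Column({ type: "timestamptz", default: () => "now()" })
  createdAt: Date;
}
```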

The first database we worked on was the one where we store lesson activities, since this part of the system was most in need of change. It holds hundreds of thousands of lesson activities, which made the migration a challenging task. However, after some long hours and a lot of coffee, we eventually managed to migrate it, which led to real improvements: navigating lesson activities became significantly faster than ever before and worked every time.

This was a very complex transition, as we hold large amounts of data that are very important to our users. Many teachers have their entire semester's worth of lesson activities in our system and we cannot tolerate any data loss, so we had to ensure the move went off without a hitch. Architecting it required that we could spin up the new database engine, migrate the old data across, and verify that everything was where it should be for every user before switching over to the new database.
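A minimal sketch of that migrate-then-verify flow, with in-memory stand-ins for the two database engines (illustrative only, not NUITEQ's actual migration code):

```typescript
// Migrate-then-verify: copy everything across, check every record, and only
// switch over on a clean pass. The stores here are in-memory stand-ins.
type DocumentStore = Map<string, unknown>;

function migrateAndVerify(oldDb: DocumentStore, newDb: DocumentStore): boolean {
  // Step 1: copy every document from the old engine into the new one.
  for (const [id, doc] of oldDb) {
    newDb.set(id, JSON.parse(JSON.stringify(doc)));
  }
  // Step 2: verify that every record landed intact before switching over.
  for (const [id, doc] of oldDb) {
    if (JSON.stringify(newDb.get(id)) !== JSON.stringify(doc)) {
      return false; // mismatch: keep serving reads from the old database
    }
  }
  return true; // clean pass: safe to switch reads to the new database
}
```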

When the switch happened in 2021, it was a seamless transition: most users had no way of knowing it had taken place, except that managing their lesson activities was faster and our uptime numbers had improved even further.

However, we knew that we would also need to start working on the other part of the data storage for Snowflake.live: the user database. This database holds user information, classes, and so on. It is large and always growing, with new users being added to Snowflake.live every day. It also has complex relations in how users interact with each other, as we offer a significant number of class management tools, the ability to assign lessons and get data back from students about how they performed, integrations with external management tools such as ClassLink, and more.

Early on in the process, we decided to execute it piece by piece, migrating the data for one feature at a time in the development version of Snowflake.live. This ended up being critical to ensuring that all the old data, which in the case of some of our users has been with us for years, was migrated correctly to the new structure.
When using a freeform document-based NoSQL database such as CouchDB, data can be formatted in many ways, so we had to spend a lot of time devising a schema that works for all our older user data, is extensible for the development of new features, and is strict enough to enforce consistency at the database level."
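To make that schema challenge concrete, here is an illustrative sketch of the kind of normalisation such a migration involves; the document shapes and field names are hypothetical, not Snowflake's actual data model:

```typescript
// Freeform documents accumulate shape drift over the years: the same field
// may appear under different keys or in different forms. A migration has to
// normalise every variant into one strict relational shape.
interface LegacyUserDoc {
  _id: string;
  name?: string;     // older documents used this key...
  fullName?: string; // ...newer ones used this one
  classes?: Array<string | { id: string }>; // shape drifted over time
}

interface UserRow {
  id: string;
  fullName: string;
  classIds: string[];
}

function normalise(doc: LegacyUserDoc): UserRow {
  return {
    id: doc._id,
    fullName: doc.fullName ?? doc.name ?? "", // tolerate both historical keys
    classIds: (doc.classes ?? []).map((c) =>
      typeof c === "string" ? c : c.id
    ),
  };
}
```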



  Sharath Patali 

  Chief Software Architect
  NUITEQ

Ever since the inception of the online version of Snowflake several years ago, we have seen organic growth of our user base. However, in the last two years, we have seen far more significant growth in usage, fuelled by our team's focus on improving the signup process, our tight integrations with rostering systems like ClassLink, and our growing relationships with our partners. This increase in usage has shown us the limitations of using complex data structures inside CouchDB. Therefore, over a year ago, we started a massive project of rewriting Snowflake's backend to use an ORM and PostgreSQL.

As the first step, we decided to migrate all the lessons data from CouchDB into a PostgreSQL database. This was a complex endeavour, considering that we had to migrate all those years of accumulated lessons data into a new structure. The process was very much like the saying "replacing the engine of the car while it's moving at full speed". All the hard work paid off in the end: we noticed improved loading times, better reliability, and significantly reduced downtime. This gave us more confidence in tackling the bigger database in the Snowflake architecture: the user data.
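One common way to replace the engine while the car is moving is to let reads prefer the new store and fall back to the old one for records that have not been migrated yet. The sketch below is illustrative of that pattern, not necessarily the exact approach used here; the types are hypothetical:

```typescript
// During the transition, reads prefer PostgreSQL and fall back to CouchDB
// for lessons that have not been migrated yet. Types are illustrative.
interface Lesson {
  id: string;
  title: string;
}

interface LessonStore {
  get(id: string): Promise<Lesson | undefined>;
}

async function getLesson(
  id: string,
  postgres: LessonStore,
  couch: LessonStore
): Promise<Lesson | undefined> {
  const migrated = await postgres.get(id);
  if (migrated) return migrated; // already on the new engine
  return couch.get(id);          // not migrated yet: serve the old copy
}
```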

Using a NoSQL database over many years had mutated the structure of the user data into a complex beast (at the time, there was no other way). As our user base grew, we started to realise that this could not scale to the level we require. After many weeks of discussions, we arrived at a multi-tiered solution: splitting the user data into related data blocks. We approached the challenge from the bottom up and started moving blocks of data from the user documents into separate tables in PostgreSQL. It took almost an entire year of careful work to test and migrate individual blocks and then interlink them. This also meant a massive rewrite of the backend architecture, so we had to thoroughly test and retest every feature we ever built, as quality assurance is something we hold dear.
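As an illustration of what splitting user data into related, interlinked blocks can look like in TypeORM (the entity names here are hypothetical, not Snowflake's actual tables):

```typescript
import {
  Entity,
  PrimaryGeneratedColumn,
  Column,
  ManyToOne,
  OneToMany,
} from "typeorm";

// Illustrative only: one way a monolithic user document can be split into
// related blocks, each in its own table. Entity names are hypothetical.
@Entity()
export class User {
  @PrimaryGeneratedColumn("uuid")
  id: string;

  @Column()
  displayName: string;

  // Class membership is one "block", interlinked through a relation.
  @OneToMany(() => ClassMembership, (m) => m.user)
  memberships: ClassMembership[];
}

@Entity()
export class ClassMembership {
  @PrimaryGeneratedColumn("uuid")
  id: string;

  @ManyToOne(() => User, (u) => u.memberships)
  user: User;

  @Column()
  classId: string;

  @Column({ type: "varchar" })
  role: "teacher" | "student";
}
```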

We have migrated all of the user data into the new database. Our users will now experience improved page loading speeds, better availability, and better scalability, along with improved speed and reliability when managing lessons and classrooms, sharing homework with students, and reviewing their responses.

Apart from migrating the data, we have also been working on improving the automation and DevOps processes around our NUITEQ Snowflake servers. We now have even better monitoring, logging, and error reporting, as well as improved and upgraded server specifications.

This has been a long journey for us and we are super excited to see our users finally experience what we have been working on for over a year!

Learn more about our award-winning educational solution NUITEQ Snowflake by signing up for a free trial.  

Try NUITEQ Snowflake

Contact us if you're interested in NUITEQ Snowflake for your school or district.
