Inside Terality: the non-parallelized engine.

November 11, 2021
Guillaume Duvaux

At Terality, we aim to build the fastest and easiest-to-use serverless data processing engine for data teams. We want to offer the same speed and scalability as Spark, but with the syntax of pandas and in a fully serverless way.

Pandas is the library most used by data science teams. We replicated its API so that you don't have to learn another syntax or change your existing code (unlike PySpark or Dask, for example).
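To make the drop-in claim concrete, here is a typical pandas workflow. With Terality, the code below would stay identical; only the import line would change (the exact Terality module name is not given in this article, so the alias in the comment is illustrative).

```python
# Standard pandas code; with Terality, only the import would differ
# (e.g. hypothetically: import terality as pd).
import pandas as pd

df = pd.DataFrame({"city": ["Paris", "Lyon", "Paris"], "sales": [10, 5, 7]})
total = df.groupby("city")["sales"].sum()
print(total.to_dict())  # {'Lyon': 5, 'Paris': 17}
```

Because the API surface is the same, existing scripts and notebooks keep working without a rewrite.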

The pandas API is massive, and reimplementing and optimizing every pandas function will take more time. Because we want everyone to be able to use Terality before we complete this task, we designed a solution that covers the vast majority of the pandas API.

To that end, we have released a side engine that runs all the functions we haven't optimized and parallelized yet. Conceptually, this side engine behaves as if you had a single large server with hundreds of gigabytes of memory.

How does it work?

When you call a pandas function from your notebook or IDE, it runs on our infrastructure, whatever the function:

  • If we have implemented the parallelization, it runs on our main engine: very fast, and reliable up to 100 GB (and more in the future).
  • If not, Terality still runs your function, but on the "non-parallelized" engine. As of September 2021, we don't recommend running workloads of more than 30 GB on this side engine.
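Conceptually, the routing logic can be sketched as a simple dispatch: check whether a parallelized implementation exists, and fall back to the single large worker otherwise. This is an illustrative sketch, not Terality's actual code; the function names and the coverage set are assumptions.

```python
# Illustrative sketch of engine routing (not Terality's real implementation).
# Functions with a parallelized implementation go to the main engine;
# everything else falls back to the single-node "non-parallelized" engine.

PARALLELIZED = {"read_csv", "merge", "groupby"}  # hypothetical coverage set

notified = []  # fallback calls flagged for the engineering team

def run_parallelized(func_name):
    return f"{func_name}: main engine"

def run_fallback(func_name):
    # Conceptually: one large worker with hundreds of GB of memory.
    notified.append(func_name)  # flag the function for future optimization
    return f"{func_name}: non-parallelized engine"

def dispatch(func_name):
    if func_name in PARALLELIZED:
        return run_parallelized(func_name)
    return run_fallback(func_name)

print(dispatch("merge"))  # merge: main engine
print(dispatch("pivot"))  # pivot: non-parallelized engine
print(notified)           # ['pivot']
```

The key design property is that the caller never chooses an engine: `dispatch` decides, which is what makes the switch invisible to the user.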

Terality main and side engines.

For you as a user, the switch between both engines is entirely transparent. We keep Terality's promise: it's fully serverless, scalable, and compatible with the pandas API.

To help us prioritize which functions to implement in the main parallelized engine, our systems automatically notify our team whenever a function call is routed to the non-parallelized engine.
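A minimal version of that prioritization signal is just a tally of fallback calls per function, so the most-requested functions get parallelized first. This is a hypothetical sketch of the idea, not Terality's actual telemetry.

```python
from collections import Counter

# Hypothetical sketch: count which unoptimized functions users call most,
# to decide what to parallelize next.
fallback_calls = Counter()

def record_fallback(func_name):
    fallback_calls[func_name] += 1

# Simulated stream of calls hitting the non-parallelized engine.
for name in ["pivot", "pivot", "apply", "pivot"]:
    record_fallback(name)

print(fallback_calls.most_common(1))  # [('pivot', 3)]
```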

You can start using Terality today by visiting our Website and Documentation.

We are in beta as of September 2021. Please reach out to us via the live chat on our website or by email; we are open to all feedback, remarks, and questions.

Interested in joining the team?
