Increasing Server Capacity
Our Bioinformatics team, consisting of Konstantin and José, has taken it upon itself to improve our computational setup. From now on, our computational system will comprise three nodes, each with its own purpose and function. The node with the largest storage capacity, with 24 CPU cores, 64 GB of RAM, and 324 TB of HDD storage, will be used for storing our data sets. The second and most powerful node, with 128 CPU cores, 1024 GB of RAM, and 64 TB of SSD storage, will serve as the computational node; its 1 TB of working memory allows us to run our pipelines at increased speed. Last but not least, we have added a data warehouse node with 32 CPU cores, 128 GB of RAM, and 100 TB of HDD storage. Our nodes now also include two graphics processing units (GPUs) for running AI-based algorithms. Altogether, this sums up to almost 200 CPU cores for peak performance and around 500 TB of storage space. We have therefore increased both the central storage available to our working group members and our computational capacity, paving the way for new upcoming analyses.
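For reference, the quoted totals can be tallied directly from the per-node figures above. Below is a minimal Python sketch; the node labels are only illustrative, not official hostnames:

    # Announced per-node specifications: CPU cores, RAM (GB), storage (TB).
    nodes = {
        "storage":        {"cores": 24,  "ram_gb": 64,   "storage_tb": 324},
        "compute":        {"cores": 128, "ram_gb": 1024, "storage_tb": 64},
        "data_warehouse": {"cores": 32,  "ram_gb": 128,  "storage_tb": 100},
    }

    total_cores = sum(n["cores"] for n in nodes.values())            # 184, i.e. "almost 200"
    total_storage_tb = sum(n["storage_tb"] for n in nodes.values())  # 488, i.e. "around 500 TB"

    print(f"Total CPU cores: {total_cores}")
    print(f"Total storage: {total_storage_tb} TB")

Summing the specs gives 184 cores and 488 TB across the three nodes, which matches the rounded figures stated above.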