

Memory Monitoring and Limiting with LXC


For a long time we didn't limit the amount of memory a build could use on Codeship. There was always the possibility of a bad build eating up all of our memory.

A few weeks ago that bad build happened, using up so much memory that it degraded performance and eventually killed the test server. Even though we measured the memory usage of the whole test server, we didn't have the data to figure out exactly which build caused the trouble.

Combined maximum and minimum memory usage of Amazon EC2 Instances.


How to avoid builds eating up all of your memory
We couldn't risk this problem happening again and threatening other builds on the same test server. We didn't have enough data about memory limits at that point, so we had to make an educated guess. My first assumption was the most conservative one: each test server has 60GB of memory and we run 22 builds on each instance, so each build can get a maximum of 2.5GB of memory. As LXC manages memory with cgroups, it is simple to set a memory limit for each container.
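That per-build budget is simple arithmetic; a quick sketch using the figures above:

```ruby
# Per-build memory budget (figures from the article).
INSTANCE_MEMORY_GB  = 60   # memory on each EC2 test server
BUILDS_PER_INSTANCE = 22   # concurrent builds per server

per_build_gb = INSTANCE_MEMORY_GB.to_f / BUILDS_PER_INSTANCE  # ~2.7GB
limit_bytes  = 2560 * 1024 * 1024  # rounded down to a conservative 2.5GB (2560MB)
```

The exact division gives roughly 2.7GB per build; rounding down to 2.5GB leaves some headroom for the host itself.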

Setting lxc.cgroup.memory.limit_in_bytes in container config:

lxc.cgroup.memory.limit_in_bytes = 2560M

This worked; the bad build was no longer a threat.

We got a few support requests where people asked about their builds being stuck. While 2.5GB was enough for 95% of the builds, the other 5% were hitting the limit. We needed to increase the memory limit to ensure that all builds run fine. After two incremental steps, the Codeship is sailing smoothly with a 10GB memory limit for each build.


Monitoring in the Future
We wanted more data to improve the memory limit in the future, so we started measuring the maximum memory usage of each build. We export it to Librato Metrics and save it as metadata of the build in our database. This allows us to inspect the memory usage easily. We plan to show the memory usage to our users in the future.

LXC tracks the memory usage of each container and exposes that and many other values in the cgroup. Right before we shut down the build, we read the memory usage from the cgroup. You can read the memory usage from the cgroup on the LXC host.
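The original snippet for that read step didn't survive in this copy; a minimal sketch, assuming a cgroup v1 layout and a hypothetical container name `build-123` (the actual path depends on your LXC setup):

```ruby
# Hypothetical path -- cgroup v1 memory controller on the LXC host,
# for a container named "build-123". Adjust for your own layout.
MAX_USAGE_FILE = "/sys/fs/cgroup/memory/lxc/build-123/memory.max_usage_in_bytes"

# Read the peak memory usage (in bytes) the cgroup has recorded.
def max_memory_bytes(path = MAX_USAGE_FILE)
  File.read(path).strip.to_i
end

def bytes_to_megabytes(bytes)
  bytes / (1024 * 1024)
end
```

With a reading in hand, it can be converted and attached to the build record before the container is destroyed.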


We then send the data to Librato Metrics:

Metriks.timer('build.memory_usage').update(metrics[:max_memory])

Building a Metrics Infrastructure
It is important to back your actions with data. Data is your best argument. In case you don't have any useful data to solve the problem, it is important that you can easily add more metrics to your infrastructure. Sit down with your coworkers and think about good metrics for your product. Sometimes they aren't easy to spot at first glance. We often talk about our metrics.

We discuss new metrics, but we also talk about removing redundant ones. It is exciting to talk about data and what your coworkers conclude from it. We are able to add new metrics to our infrastructure in minutes. By building this metrics infrastructure, you can handle any new challenge by quickly adding measurements that help you decide on the next steps. This infrastructure can be the difference between life and death for your service, so make sure you have it in place.

Do you monitor? Which tools and services do you use for it? Let us know in the comments!

Further Information on Linux Containers

Download Efficiency in Development Workflows: A free eBook for Software Developers. This book will save you a lot of time and make you and your development team happy.

Go ahead and try Codeship for Continuous Integration and Continuous Deployment! Setup for your GitHub and Bitbucket projects takes only 3 minutes. It's free!

More Stories By Manuel Weiss

I am the cofounder of Codeship – a hosted Continuous Integration and Deployment platform for web applications. On the Codeship blog we love to write about Software Testing, Continuous Integration and Deployment. Also check out our weekly screencast series 'Testing Tuesday'!
