There’s no denying database applications have come a long way over the past few years.
Despite all the improvements, however, they’re still far from perfect—sometimes, they even feel painfully slow. A seemingly quick and easy task can end up taking hours for no good reason. The result? Angry users, suspicious managers, and a generally unhappy team.
A slow database isn’t something you can brush aside. According to a SolarWinds® report, nearly one-third of data pros are responsible for over 300 databases in their organisations’ environments. Database administrators (DBAs) are already stretched incredibly thin, and—like most IT teams—they’re constantly being asked to do more with less.
Slow databases often have identifiable causes. For example, it’s possible developers were only given functional requirements during the build process and no performance expectations were set. Another common hurdle is that the team who wrote the query asked for far more data than it needed. As a general rule, the more data you ask for, the longer you’ll wait.
Unfortunately, even when users carefully craft a well-written SQL query, the response from the database can still be sluggish. In this case, the application may perform well, but the cause of the poor performance lies within the database itself. To combat this, there are three places to look first.
1. Review Your Indexes
Far too often, a task that usually takes hours could instead take seconds—if only the database had proper indexes.
To boost performance, inspect the database design and make sure it has appropriate indexes. For custom-built applications, it’s also common to see duplicate indexes, which can damage performance.
Also, stay vigilant when buying third-party products. Vendors make sure the product runs on specific database platforms, but their testing seldom goes further than this. Performance tuning, particularly for third-party products built to run on multiple database platforms, is extremely rare.
Check the indexes on all your databases; you’ll likely see a positive response.
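To see what a missing index looks like in practice, here’s a minimal sketch using Python’s built-in sqlite3 module (the `orders` table, its columns, and the row counts are hypothetical, chosen purely for illustration). SQLite’s `EXPLAIN QUERY PLAN` reveals whether the planner must scan the whole table or can seek through an index:

```python
import sqlite3

# Hypothetical schema for illustration only; the table and column
# names are not from the article.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
cur.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                [(i % 1000, i * 1.5) for i in range(10_000)])

query = "SELECT * FROM orders WHERE customer_id = 42"

# Without an index, the planner has to scan every row in the table.
scan_detail = cur.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1]
print(scan_detail)  # a full "SCAN" of the orders table

# With an index on the filtered column, it can seek straight to the rows.
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
search_detail = cur.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1]
print(search_detail)  # a "SEARCH ... USING INDEX" instead
```

The same query goes from touching 10,000 rows to touching only the handful that match—exactly the hours-to-seconds difference described above.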
2. Check for a Data Type Mismatch
Understanding data types can do wonders for improving database performance.
A downfall for many teams is choosing broad, general-purpose data types. For example, a text data type with no practical limit on how much text it can hold—e.g., varchar(max)—is detrimental to performance.
To choose the correct data type, developers need information such as how the data will be used, how much space is needed to store it properly, and how the data is likely to be used in the future.
Selecting the right data type is crucial, but it’s even more important to ensure the same type is used consistently throughout the database and the applications using the database. Doing this dramatically improves performance and ultimately leads to much-needed cost savings.
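As a small illustration of what the wrong type costs, here’s a sqlite3 sketch (the `qty` column and table names are hypothetical). Storing numeric data in a text column makes sorting and comparison lexicographic rather than numeric—and on real platforms, the implicit conversions needed to work around this typically prevent index use:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical anti-pattern: numeric quantities stored in a TEXT column.
cur.execute("CREATE TABLE orders_bad (qty TEXT)")
cur.executemany("INSERT INTO orders_bad VALUES (?)", [("9",), ("10",), ("100",)])
text_order = [row[0] for row in cur.execute("SELECT qty FROM orders_bad ORDER BY qty")]
# Lexicographic order puts '10' and '100' before '9'.

# The correct type sorts and compares numerically, with no conversions.
cur.execute("CREATE TABLE orders_good (qty INTEGER)")
cur.executemany("INSERT INTO orders_good VALUES (?)", [(9,), (10,), (100,)])
int_order = [row[0] for row in cur.execute("SELECT qty FROM orders_good ORDER BY qty")]
```

Every query against the badly typed column has to cast its way around this, which is exactly the kind of mismatch that quietly drags performance down.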
3. Investigate the Entity Framework
The Entity Framework (EF) is an open-source object-relational mapper (ORM) designed to enable .NET developers to work with a database using .NET objects. EF is an example, but this advice also applies to all ORMs, including TypeORM, Waterline, Hibernate, and NHibernate.
With EF, building the front-end app is faster and easier, especially when compared with hand-writing SQL to enable the simplest types of data access. This means developers can be much more productive for simple create, read, update, and delete data (CRUD) requirements. With the right skills, a talented developer using EF can quickly and effectively build most database-driven web applications.
It’s important to note ORMs are good at generating simple CRUD screens. However, for sophisticated user interfaces, ORMs tend to produce lower-quality SQL than a developer would write by hand.
If you have application performance problems, check to see if it’s caused by the EF or another ORM used during the development process.
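One common way an ORM causes these problems is the “N+1 query” pattern, where lazy loading turns a single logical request into one query per parent row. The sketch below simulates it with plain sqlite3 (the authors/books schema and row counts are hypothetical), alongside the hand-written JOIN that fetches the same data in one statement:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE books (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT)")
cur.executemany("INSERT INTO authors VALUES (?, ?)",
                [(i, f"author {i}") for i in range(100)])
cur.executemany("INSERT INTO books VALUES (?, ?, ?)",
                [(i, i % 100, f"book {i}") for i in range(500)])

# N+1 pattern typical of naive ORM lazy loading: one query for the
# parents, then one further query per parent for its children.
queries = 1
authors = cur.execute("SELECT id, name FROM authors").fetchall()
for author_id, _name in authors:
    cur.execute("SELECT title FROM books WHERE author_id = ?", (author_id,)).fetchall()
    queries += 1

# Hand-written SQL: one JOIN returns the same data in a single round trip.
joined = cur.execute(
    "SELECT a.name, b.title FROM authors a JOIN books b ON b.author_id = a.id"
).fetchall()
```

With 100 parent rows, the ORM-style loop issues 101 queries where one would do; over a network, each of those extra round trips adds latency the user can feel.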
Bonus: Get a Database Performance Monitoring Tool
When looking to improve database performance, a good database performance monitoring (DPM) tool can be invaluable. These tools monitor an organisation’s databases from a single pane of glass, whether they’re on-premises, in the cloud, or in a multi-cloud environment.
Having a DPM tool in place provides easier and faster access to vital performance data, and using it during development leads to a better-optimised, more efficient database design.
Additionally, DPM tools accelerate problem resolution and troubleshooting in production environments while strongly guiding the development of database queries and applications towards better-performing alternatives.
A strong DPM tool uses its own database to store performance metrics produced automatically by the operating system, database server, and application. This means it can also provide long-term trending information. What’s more, it’s able to do all of this while adding relatively little overhead to ongoing database activity.
When searching for the right tool, it’s crucial for whichever tool you choose to be able to monitor cloud-native, on-premises, and hybrid production and test databases at scale without too much performance overhead.
Deep down, database professionals know many of the problems they face diminish, or even disappear, when their organisation is adequately staffed. However, while we wait for the boardroom to address this skills gap, we must look to the fixes providing the most improvement while requiring the least amount of effort.
It’s important to remember databases don’t have to be slow. Gone are the days of the horse and carriage. It’s time to speed things up.