Are you aware of pushing compute to your database?
At the end of the day, this can be a trivial issue or a catastrophic performance problem. LINQ is so easy to use, and to abuse, that most developers are not aware that some of the compute is moved from their server to the database instance.
Is this a bad thing? As always, it depends on many factors: the application load profile, how the system is architected, what kind of load balancing is used, and so on.
In many business applications built in C#, we have a server and a database, where the server can sit behind a load balancer (so we could have, say, three instances) and SQL Server runs in a cluster with log shipping (mostly in a hot/cold configuration) for redundancy. And here is the catch: we can always add more workers behind the load balancer (just more pods in Kubernetes), but it is hard to scale up the database server. Unless we use CQRS and/or caching, we are in a performance trap: shifting compute to the database means fewer resources available for other queries.
So how does this happen?
Have a look at the snippet below and think about whether there is a catch.
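As a minimal sketch of the pattern (the ShopContext, the Order entity and the connection string here are hypothetical, standing in for whatever your application uses), consider an innocent-looking EF Core query:

```csharp
using System;
using System.Linq;
using Microsoft.EntityFrameworkCore;

public class Order
{
    public int Id { get; set; }
    public decimal Amount { get; set; }
    public decimal TaxRate { get; set; }
    public DateTime CreatedAt { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<Order> Orders => Set<Order>();

    protected override void OnConfiguring(DbContextOptionsBuilder options)
        => options.UseSqlServer("Server=.;Database=Shop;Trusted_Connection=True;");
}

public static class Report
{
    public static decimal MonthlyGrossTotal(ShopContext db, int year, int month)
    {
        // Because db.Orders is an IQueryable, the whole chain below is
        // translated to SQL: the filtering, the per-row arithmetic and
        // the SUM aggregation all execute on the database server, not
        // on the API server.
        return db.Orders
            .Where(o => o.CreatedAt.Year == year && o.CreatedAt.Month == month)
            .Sum(o => o.Amount * (1 + o.TaxRate));
    }
}
```

The catch is that this compiles down to something like `SELECT SUM(Amount * (1 + TaxRate)) FROM Orders WHERE ...`, so the multiplication and the aggregation burn CPU on the one resource we said is hard to scale: the database.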
Below is the client statistics output for a query with shifted compute.
To avoid this, we need to fetch the data first and then process it on the API/server side.
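A minimal sketch of that approach, reusing the hypothetical ShopContext from above: AsEnumerable() marks the point where the query leaves the database and continues in our own process.

```csharp
public static decimal MonthlyGrossTotalOnServer(ShopContext db, int year, int month)
{
    // Only the Where clause is translated to SQL. AsEnumerable() switches
    // from IQueryable to IEnumerable, so the arithmetic and the SUM run
    // in the API process after the matching rows have been fetched.
    return db.Orders
        .Where(o => o.CreatedAt.Year == year && o.CreatedAt.Month == month)
        .AsEnumerable()
        .Sum(o => o.Amount * (1 + o.TaxRate));
}
```

The trade-off is explicit here: we pull more data over the wire, but the database only does the filtering, and the compute lands on the workers we can cheaply scale out behind the load balancer.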