Pattern for a very slow database server

I am building an ASP.NET MVC site where I have a dedicated server for the web application, but the database lives on a very busy MS SQL Server used by many other applications.

So although the web server is very fast, the application's response time suffers, mainly because of the slow response of the DB server.

I cannot change the DB server, as all data entered into the web application must ultimately end up there (for backup reasons).

The database is only used by the web app, and I would like to find a caching mechanism where all data is cached on the web server and updates are sent to the DB asynchronously.

It is not important for me to have a direct correspondence between the data read from the DB and the newly inserted data (think of how you read StackOverflow questions: a newly inserted question doesn't have to appear right after the insert).

I was thinking of creating an intermediate WCF service that would exchange and sync data between the slow DB server and a local one (maybe SQLite or SQL Server Express).
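For concreteness, the rough shape of the contract I have in mind (the names and operations here are only illustrative, not a working design):

    using System.ServiceModel;

    // Sketch of the intermediate sync service I'm imagining: the web app
    // writes through it to a fast local store, and a scheduled push moves
    // accumulated changes to the slow central server.
    [ServiceContract]
    public interface ISyncService
    {
        // Fast path: persist to the local store and return immediately.
        [OperationContract]
        void SaveLocal(string entityType, string payload);

        // Scheduled path: push accumulated local changes to the slow server.
        [OperationContract(IsOneWay = true)]
        void PushChangesToCentral();
    }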

What would be the best pattern for this problem?



7 replies


What's your bottleneck? Reading data or writing data?

If reading data is the bottleneck, an in-memory caching engine like memcached will be a performance booster; most of the largest websites use one, and the write-ups on scaling Facebook and hi5 with memcached are a good read. An application-side cache also absorbs read requests before they reach the database, which lowers DB load and improves response times. It will not, however, do much about the overall load on the database server, since the other heavy applications still share it.
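As a minimal sketch of that read-through pattern, here is what it could look like in the ASP.NET app, using the in-process System.Runtime.Caching.MemoryCache as a stand-in for memcached (a real memcached client such as Enyim exposes a similar get/store shape); the Question type and LoadQuestionFromDb are hypothetical placeholders:

    using System;
    using System.Runtime.Caching;

    public class Question
    {
        public int Id;
        public string Title;
    }

    public static class QuestionCache
    {
        static readonly MemoryCache Cache = MemoryCache.Default;

        public static Question Get(int id)
        {
            string key = "question:" + id;

            // Serve from cache when possible; only misses hit the slow DB server.
            var hit = Cache.Get(key) as Question;
            if (hit != null)
                return hit;

            Question fresh = LoadQuestionFromDb(id);  // the expensive remote query

            // A short absolute expiry bounds staleness without requiring
            // explicit invalidation on every write.
            Cache.Set(key, fresh, DateTimeOffset.Now.AddMinutes(5));
            return fresh;
        }

        // Stub standing in for the real query against the busy server.
        static Question LoadQuestionFromDb(int id)
        {
            return new Question { Id = id, Title = "loaded from the slow server" };
        }
    }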



If writing data is the bottleneck, implementing some kind of asynchronous staging data store seems necessary. Keeping a fast, low-latency database such as MySQL or PostgreSQL (maybe not that lightweight ;)) on your frontend server, and using your real database as a replication slave for your site, would be a good choice for you.
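A sketch of that staging idea, adapted to the asker's stack (SQL Server Express locally instead of MySQL/PostgreSQL; the connection strings and Answers table are made up): writes land in the fast local database with a Synced flag, and a background job later pushes unsynced rows to the busy central server.

    using System;
    using System.Collections.Generic;
    using System.Data.SqlClient;

    public class StagingStore
    {
        const string LocalCs  = @"Data Source=.\SQLEXPRESS;Initial Catalog=Staging;Integrated Security=True";
        const string RemoteCs = "Data Source=busy-db-server;Initial Catalog=Main;Integrated Security=True";

        // Fast path: the web request waits only for the local insert.
        public void Save(string body)
        {
            using (var cn = new SqlConnection(LocalCs))
            using (var cmd = new SqlCommand(
                "INSERT INTO Answers (Body, Synced) VALUES (@body, 0)", cn))
            {
                cmd.Parameters.AddWithValue("@body", body);
                cn.Open();
                cmd.ExecuteNonQuery();
            }
        }

        // Slow path: run from a timer or Windows service, off the request thread.
        public void PushPending()
        {
            var pending = new List<KeyValuePair<int, string>>();

            using (var local = new SqlConnection(LocalCs))
            {
                local.Open();
                using (var read = new SqlCommand(
                    "SELECT Id, Body FROM Answers WHERE Synced = 0", local))
                using (var rows = read.ExecuteReader())
                    while (rows.Read())
                        pending.Add(new KeyValuePair<int, string>(
                            rows.GetInt32(0), rows.GetString(1)));
            }

            foreach (var row in pending)
            {
                // Replay the write against the slow central server.
                using (var remote = new SqlConnection(RemoteCs))
                using (var push = new SqlCommand(
                    "INSERT INTO Answers (Body) VALUES (@body)", remote))
                {
                    push.Parameters.AddWithValue("@body", row.Value);
                    remote.Open();
                    push.ExecuteNonQuery();
                }

                // Only mark the row synced once the central insert succeeded.
                using (var local = new SqlConnection(LocalCs))
                using (var mark = new SqlCommand(
                    "UPDATE Answers SET Synced = 1 WHERE Id = @id", local))
                {
                    mark.Parameters.AddWithValue("@id", row.Key);
                    local.Open();
                    mark.ExecuteNonQuery();
                }
            }
        }
    }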



I would do what you are already considering. Use a different database for the application and only use the current one for backup purposes.





I had this problem once and decided to go for a combination of data warehousing (i.e., fetching data from the slow database from time to time and storing it in a separate read-only database) and a message queue handled by a Windows service (for the updates).

This worked surprisingly well, because MSMQ ensured reliable message delivery (no updates were lost) and the warehouse ensured that the data was available in the local database.
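A minimal sketch of that arrangement with System.Messaging (the queue path and UpdateMessage type are invented for illustration): the web app enqueues and returns immediately, and the Windows service receives each message and applies it to the slow central database. The transactional queue is what backs the "no updates were lost" guarantee.

    using System.Messaging;

    public class UpdateMessage
    {
        public int QuestionId { get; set; }
        public string Body { get; set; }
    }

    public static class UpdateQueue
    {
        const string Path = @".\Private$\WebAppUpdates";

        // Web-app side: enqueue and return to the user immediately.
        public static void Enqueue(UpdateMessage msg)
        {
            if (!MessageQueue.Exists(Path))
                MessageQueue.Create(Path, true);  // transactional, for reliable delivery

            using (var q = new MessageQueue(Path))
                q.Send(msg, MessageQueueTransactionType.Single);
        }

        // Windows-service side: blocks until a message arrives; the service
        // then applies the update to the slow central database.
        public static UpdateMessage Dequeue()
        {
            using (var q = new MessageQueue(Path))
            {
                q.Formatter = new XmlMessageFormatter(new[] { typeof(UpdateMessage) });
                using (Message m = q.Receive(MessageQueueTransactionType.Single))
                    return (UpdateMessage)m.Body;
            }
        }
    }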

However, this still depends on several factors. If you have tons of data to move into your web application, rebuilding the warehouse may take a while, and you might need to consider data replication or transaction log shipping instead. Also, changes are not visible until the warehouse is rebuilt and the queued messages are processed.

On the other hand, this solution is scalable and relatively simple to implement. (You can use Integration Services to pull the data into the warehouse, for example, and handle the changes in the business-logic layer.)



There are many replication methods that should give you the results you want. By installing an instance of SQL Server on the web-server side of your setup, you have a choice between:

  • Snapshot replication from the web server (publisher) to the database server (subscriber). You will need a paid edition of SQL Server on the web server. I've never worked with a configuration like this, but it could use a lot of web-server resources at the scheduled sync times.
  • Merge replication (or on-demand transactional replication) between the database server (publisher) and the web server (subscriber). In that case you can use the free edition of SQL Server and schedule the sync process according to your tolerance for potential data loss if the web server goes down.


I wonder if you could improve things by adding an MDF file on your web side instead of contacting the server at a different IP...

Just add a SQL Server 2008 Express Edition database file and try it: if you don't go over 4 GB of data you will be fine. Of course there are other limits, but just for the speed, why not give it a shot?
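For what it's worth, a sketch of what using such a local file could look like from the web app, assuming the MDF lives in the ASP.NET App_Data folder (the file and table names are hypothetical):

    using System.Data.SqlClient;

    public static class LocalDb
    {
        // |DataDirectory| resolves to the App_Data folder in an ASP.NET app,
        // so SQL Server Express attaches the file on first use.
        const string Cs =
            @"Data Source=.\SQLEXPRESS;" +
            @"AttachDbFilename=|DataDirectory|\WebAppLocal.mdf;" +
            "Integrated Security=True;User Instance=True";

        public static int CountQuestions()
        {
            using (var cn = new SqlConnection(Cs))
            using (var cmd = new SqlCommand("SELECT COUNT(*) FROM Questions", cn))
            {
                cn.Open();
                return (int)cmd.ExecuteScalar();
            }
        }
    }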



You should also consider the network. If the DB server is talking to multiple web servers, it may be limited by the speed of the network connection. If they are only connected through a 100 Mbit switch, you may need to upgrade that as well.



A WCF service would be a very poor engineering solution to this problem: why create your own when you can use SQL Server's standard mechanisms to ensure correct data transfer? Log shipping will ship the data at whatever interval you choose.

That way you get a fast local SQL Server, and the data still ends up correctly on the slow backup server.

You should also investigate the slow SQL Server, though: the performance issue might not be related to its load, and more to do with the queries and indexes you are asking it to work with.
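As a starting point for that investigation, something like the following ranks the server's queries by average CPU time via the sys.dm_exec_query_stats DMV (a sketch: it assumes you have VIEW SERVER STATE permission and supply the connection string yourself):

    using System;
    using System.Data.SqlClient;

    public static class SlowQueryReport
    {
        public static void PrintTopQueries(string connectionString)
        {
            // total_worker_time is reported in microseconds.
            const string sql = @"
                SELECT TOP 10
                       qs.total_worker_time / qs.execution_count AS avg_cpu_us,
                       qs.execution_count,
                       SUBSTRING(st.text, 1, 200) AS query_text
                FROM sys.dm_exec_query_stats qs
                CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
                ORDER BY avg_cpu_us DESC";

            using (var cn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(sql, cn))
            {
                cn.Open();
                using (var r = cmd.ExecuteReader())
                    while (r.Read())
                        Console.WriteLine("{0} us avg CPU, {1} runs: {2}",
                            r.GetInt64(0),
                            r.GetInt64(1),
                            r.IsDBNull(2) ? "" : r.GetString(2));
            }
        }
    }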







