I have a customer with nearly 4 TB of data in their SQL Server databases. When they run index maintenance it rewrites so many pages that the backup files change by roughly 500 GB, and all of that then needs to be replicated off-site, which is causing storage problems. I've seen a few scripts out there that check the databases daily and only rebuild or reorganize an index once it crosses a certain fragmentation percentage, which should cut down on the nightly churn. I'm not a SQL DBA, but I've been tasked with figuring this out because my company is responsible for backing up this server. Can anyone point me in the right direction? Any help is greatly appreciated.
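For reference, the kind of logic I've seen floating around looks roughly like this. This is only a sketch based on the `sys.dm_db_index_physical_stats` DMV; the 5%/30% thresholds are the commonly cited Microsoft guidance, and the 1000-page cutoff is just a placeholder I've seen suggested, not something I've tested:

```sql
-- Rough sketch: reorganize indexes between 5% and 30% fragmented,
-- rebuild anything above 30% (the commonly cited thresholds).
-- Run in the context of the target database.
DECLARE @sql nvarchar(max) = N'';

SELECT @sql += N'ALTER INDEX ' + QUOTENAME(i.name)
    + N' ON ' + QUOTENAME(s.name) + N'.' + QUOTENAME(o.name)
    + CASE WHEN ips.avg_fragmentation_in_percent > 30
           THEN N' REBUILD;'
           ELSE N' REORGANIZE;'
      END + CHAR(13)
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
    ON i.object_id = ips.object_id AND i.index_id = ips.index_id
JOIN sys.objects AS o ON o.object_id = i.object_id
JOIN sys.schemas AS s ON s.schema_id = o.schema_id
WHERE ips.avg_fragmentation_in_percent > 5
  AND i.name IS NOT NULL           -- skip heaps
  AND ips.page_count > 1000;       -- ignore tiny indexes (placeholder cutoff)

EXEC sys.sp_executesql @sql;
```

Is something along these lines the right approach, or is there a tested script or tool people would recommend instead?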