
Inventory control and your databases…

No, not necessarily the “what servers do you have and what’s on them” variety. In a previous life, inventory and point-of-sale systems were a focus, and one of the great things we could nearly always point to was that initial inventory of the store location – their stuff on the shelves. So many times, both we and the store owner were surprised by the inventory: how much there was, things they hadn’t seen in a while, stuff back on shelves behind other products, or just generally not known about. In many cases, the inventory we “found” more than paid for the new software and systems – it could be quite substantial.

With your database systems, you may find a similar situation once you really start looking. Many of the data folks out there are going through the process of cataloging and classifying their systems – from the vulnerability testing mentioned earlier to getting a good understanding of the information in the system now and how it’s protected.

Many times, we’ve had pushback from people who “know for sure!” what’s in their systems and databases. The pushback comes when we suggest that a fresh look, usually an automated one, is in order to make sure all of the pieces expected to be there are, and that there aren’t pieces that shouldn’t be. It’s been a common occurrence during review cycles to find unexpected tables or other resources (views, stored procedures, other things working with the data).

This is a huge reason we suggest an automated tool as a starting point. Starting with something as simple as the data classification tool can expose unexpected tables and even unexpected columns and data in your systems. It might not peel back too many layers on the actual information stored, but you may suddenly find that you have extraneous tables hanging around.
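That “fresh, automated look” can be as simple as diffing a documented inventory against what the catalog actually reports. Here’s a minimal sketch in Python – the table lists are hypothetical stand-ins; in practice the “actual” list would come from querying the database catalog:

```python
# Hypothetical example: compare a documented table inventory against
# what the database catalog actually reports.

# What the team believes is in the database (hypothetical names).
expected = {"dbo.Customers", "dbo.Orders", "dbo.OrderLines"}

# What the catalog actually reports - in practice this would come from
# a query such as: SELECT SCHEMA_NAME(schema_id) + '.' + name FROM sys.tables
actual = {"dbo.Customers", "dbo.Orders", "dbo.OrderLines", "dbo.Customers_bak"}

unexpected = sorted(actual - expected)  # present, but nobody documented it
missing = sorted(expected - actual)     # documented, but not actually there

print("Unexpected:", unexpected)
print("Missing:", missing)
```

Either list being non-empty is exactly the surprise described above – and it’s the kind of thing a human eyeballing SSMS will miss in a database with hundreds of objects.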

This has been the case several times where information was being modified, and backup or working copies of tables were created to help with the migration, or as a “just in case” measure. After the full operation completed (the migration or update or whatever was happening), that temporary table was never removed… and in some cases it still had information in it, just sitting there with not much attention or awareness – until, of course, your automated tool came along and pointed it out, saying “hey, this should probably be classified as Confidential.”
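Those leftover “just in case” tables often betray themselves by name. A hedged sketch of a name-pattern scan – the suffixes and sample names here are assumptions about common naming habits, and anything flagged still needs human review before it’s touched:

```python
import re

# Suffixes commonly tacked onto ad-hoc backup/working copies (an assumption;
# adjust the pattern to your shop's naming habits). Also catches trailing
# date stamps like _20190301.
LEFTOVER_PATTERN = re.compile(r"(_bak|_backup|_old|_tmp|_temp|\d{8})$", re.IGNORECASE)

def find_leftovers(table_names):
    """Flag tables whose names look like forgotten migration copies."""
    return [name for name in table_names if LEFTOVER_PATTERN.search(name)]

# Hypothetical table list for illustration.
tables = ["Customers", "Customers_bak", "Orders_20190301", "Orders", "Invoices_old"]
print(find_leftovers(tables))
# → ['Customers_bak', 'Orders_20190301', 'Invoices_old']
```

A scan like this is only a hint – the real point is that the copies can hold the same sensitive data as the originals, with none of the attention.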

Not only does that information need to be classified and protected, but the table itself may be an issue that needs to be addressed. These automated tools have no agenda and no expectation of what’s in your system. They’re a solid place to start and provide broad-based information about what you need to work through. We’ll be talking more about the classification tool (SSMS -> database -> Tasks -> Classify Data), but it’s worth a run-through to make sure you’re covering all of the information bits that are in your databases… and that you know about all of those bits, not just the expected pieces.
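Part of what a classification tool does is match column names against patterns associated with sensitive data. A much-simplified sketch of that kind of heuristic – the keyword map and labels below are assumptions for illustration, not the tool’s actual rules:

```python
# Assumed keyword-to-label map for illustration; the real tool's
# discovery rules are far richer than a simple substring match.
SENSITIVE_KEYWORDS = {
    "ssn": "Confidential - GDPR",
    "email": "Confidential",
    "phone": "Confidential",
    "salary": "Highly Confidential",
    "card": "Highly Confidential",
}

def suggest_labels(column_names):
    """Suggest a sensitivity label for each column whose name matches a keyword."""
    suggestions = {}
    for col in column_names:
        for keyword, label in SENSITIVE_KEYWORDS.items():
            if keyword in col.lower():
                suggestions[col] = label
                break
    return suggestions

# Hypothetical columns, including ones on a forgotten backup table.
cols = ["CustomerID", "EmailAddress", "HomePhone", "SSN", "Notes"]
print(suggest_labels(cols))
```

Run against every table – including the forgotten copies – a heuristic like this is exactly how the “hey, this should probably be Confidential” moment happens.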