Compress Long Binary Column

thelmacottage
Participant

I would like to compress a Long Binary column in a table (email .msg files, PDFs, JPGs). We have replicating DBs.

If I apply ALTER TABLE ... ALTER COLUMN ... COMPRESS at the consolidated DB (with passthrough), will I have to dbunload -ar each remote DB and the consolidated to reclaim disk space?
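For reference, a minimal sketch of what that could look like at the consolidated database, assuming SQL Remote replication and a hypothetical table "attachments" with a LONG BINARY column "doc" (the column attribute keyword is COMPRESSED; check the passthrough scoping against your own setup):

    -- Hypothetical names; executed at the consolidated DB.
    PASSTHROUGH;                                     -- statements below run here and are also sent to the remote users (add a FOR clause to target specific ones)
    ALTER TABLE attachments ALTER doc COMPRESSED;    -- enable compression on the blob column
    PASSTHROUGH STOP;                                -- leave passthrough mode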

Accepted Solutions (1)

Breck_Carter
Participant

Yes and no.

If "reclaim disk space" means "reduce the size of the *.db file" then yes, recreating the *.db file from scratch (dbinit dbunload reload etc) is the only way to do that... it is not limited to COMPRESS.

If "reclaim disk space" means "reuse free space for new or expanding rows" then no, freshly-created free space in a table data page can be used for new data in the same table, and free pages can be used for anything, without having to recreate the *.db file.

FWIW, you might also consider a deduplication scheme if the same object is commonly stored in multiple rows.
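One possible shape for that, purely as an illustration (the table and column names, and the application-side hashing, are assumptions, not anything from this thread): store each distinct object once in a lookup table keyed by a content hash, and let the original rows reference it.

    -- Hypothetical dedup layout: one row per distinct object.
    CREATE TABLE blob_store (
        blob_hash  VARCHAR(64) NOT NULL PRIMARY KEY,   -- e.g. a SHA-256 hex digest computed by the application
        doc        LONG BINARY COMPRESSED
    );

    -- Original rows just point at the stored object.
    CREATE TABLE attachments (
        attachment_id  INTEGER NOT NULL DEFAULT AUTOINCREMENT PRIMARY KEY,
        blob_hash      VARCHAR(64) NOT NULL REFERENCES blob_store( blob_hash )
    );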

VolkerBarth
Contributor

Just to add:

Whether your compressed data will really require fewer table pages than before (or possibly even more) should be checked via the "sa_column_stats" system procedure. I tested with "medium-length" strings a while ago (256 bytes each), only to find out they were not compressed at all, which led to that DCX remark...
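For example (hypothetical table and column names; check the parameter order and result columns against your version's documentation), the check could be as simple as:

    -- Column statistics for the blob column; compare before and after enabling compression.
    SELECT * FROM sa_column_stats( 'attachments', 'doc' );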

That being said, I guess for typical BLOBs like PDFs and images, compression should actually take place.

Answers (0)