It's an easier problem than it seems, I think. I've done similar spreadsheets to calculate data growth when using deduplication in virtual tape libraries.
What I would do for this is put your formula into a spreadsheet with one row per day.
I would have columns for InitialSize (that day's starting size), GrowthToday, ChangeToday, and EndSize.
Each row would start at that day's InitialSize, multiply it by GrowthRate to get GrowthToday, and by ChangeRate to get ChangeToday; adding those three together gives EndSize. Today's EndSize becomes tomorrow's InitialSize.
Repeat 264 times.
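If you'd rather script it than build the spreadsheet, here's a minimal Python sketch of the same row-by-row recurrence. The 0.1% growth rate is purely an assumed placeholder (you'll need your own figure); the 0.5% change rate and 2.2GB starting size are the first-guess numbers mentioned below.

```python
DAYS = 264
GROWTH_RATE = 0.001   # assumed 0.1% daily net growth -- substitute your own
CHANGE_RATE = 0.005   # 0.5% of existing data changed per day (first guess)

size = 2.2            # GB, one user's starting storage

rows = []             # (InitialSize, GrowthToday, ChangeToday, EndSize) per day
for day in range(DAYS):
    growth_today = size * GROWTH_RATE
    change_today = size * CHANGE_RATE
    end_size = size + growth_today + change_today
    rows.append((size, growth_today, change_today, end_size))
    size = end_size   # today's EndSize becomes tomorrow's InitialSize
```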
Your required backup space on any given day will be that day's EndSize plus the sum of the last 30 ChangeToday values. Add some slack (for leap year, month-end close generating more data, unexpected data growth, and the like) and that should be your answer.
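Continuing the sketch above, the required space on day d is that day's EndSize plus a trailing 30-day sum of ChangeToday, padded by a slack factor (the 20% here is an arbitrary placeholder):

```python
SLACK = 1.20  # arbitrary 20% headroom for leap year, month-end close, etc.

required = []
for d, (_, _, _, end_size) in enumerate(rows):
    window = rows[max(0, d - 29):d + 1]          # the last 30 days, inclusive
    recent_changes = sum(r[2] for r in window)   # sum of ChangeToday values
    required.append((end_size + recent_changes) * SLACK)

print(f"Peak backup space per user: {max(required):.2f} GB")
```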
I've attached a spreadsheet with the above programmed in, for a single user with 2.2GB of initial storage. You can multiply the BackedUp column by NumberOfUsers to get an aggregate number. You'll have to determine your own daily change rate; 0.5% is not a bad first guess, but it depends on whether your application does byte-level differencing or stores the entire changed file, especially when you're keeping a copy every four hours.