On 2019-03-24 11:11 AM, Tony Shelver wrote:
> Not the answer you are looking for, but...
>
> I'd suggest trying to create a non-trivial set of dummy data to test your
> assumptions before deciding on a route.
> It's saved my (professional) life a few times over the years when dealing
> with untested designs and new (to us) technology.
>
> Some years ago we were implementing an identity management system for a
> large US bank, with SQL Server as the store, with a planned integration to
> the ID / access / permission data of some 300+ systems, targeting 30k-plus
> users.
>
> Requirements kept changing as we added new systems to the initial mix,
> which the ID management package couldn't handle out of the box, so we had to
> implement a custom design. We had to choose between two database designs:
> one fully normalized (on the premise that 'everything is an object'), and
> one where we made some assumptions and fixed some table structures in the
> interest of performance.
>
> Eventually we spent a few days adding non-trivial amounts of test data to
> the proposed designs, and it quickly became apparent that the fully
> normalized design was unworkable once we got beyond 10 systems or so.
>
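The dummy-data approach described above can be sketched in a few lines. This is a minimal illustration only, with invented table names and row counts (not the actual bank schema from the story): it populates an in-memory SQLite database with generated users, systems, and permission links, then runs the kind of representative join you would want to time at scale before committing to a design.

```python
# Sketch only: hypothetical schema and row counts, not the design from the thread.
# Generates non-trivial dummy data and runs a representative access-lookup join,
# the kind of check the post recommends doing before settling on a schema.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE users (user_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE systems (system_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE permissions (
    user_id INTEGER REFERENCES users(user_id),
    system_id INTEGER REFERENCES systems(system_id),
    role TEXT
);
""")

N_USERS, N_SYSTEMS = 1000, 50   # scale these up to stress the design
cur.executemany("INSERT INTO users VALUES (?, ?)",
                [(i, f"user{i}") for i in range(N_USERS)])
cur.executemany("INSERT INTO systems VALUES (?, ?)",
                [(i, f"system{i}") for i in range(N_SYSTEMS)])
# dummy grant pattern: every user can reach every 10th system
cur.executemany("INSERT INTO permissions VALUES (?, ?, ?)",
                [(u, s, "reader")
                 for u in range(N_USERS)
                 for s in range(0, N_SYSTEMS, 10)])
conn.commit()

# representative query: which systems can a given user reach?
rows = cur.execute("""
    SELECT s.name FROM permissions p
    JOIN systems s ON s.system_id = p.system_id
    WHERE p.user_id = ?
""", (42,)).fetchall()
print(len(rows))  # 5 with the grant pattern above
```

Bumping N_USERS and N_SYSTEMS toward the real targets (30k users, 300 systems) and re-timing the queries is what surfaces a design that only works at toy scale.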
Good advice - much appreciated.
Frank