Linked by Eugenia Loli on Wed 15th Mar 2006 16:35 UTC
Databases "Within the past two years, Oracle, IBM and Microsoft have all released freely available versions of their flagship database servers, a move that would have been unheard of just a few years ago. While their respective representatives would argue the move was made in order to better accommodate the needs of all users, it's fairly clear that continued pressure from open source alternatives such as MySQL and PostgreSQL has caused these database juggernauts to rethink their strategies within this increasingly competitive market."

The amount of data is no problem for PostgreSQL, however:
- PostgreSQL needs to analyze its tables regularly to keep the planner's statistics current. On a large database this can take a long time.
- Query optimization becomes very important with large data sets. The PostgreSQL query optimizer usually gets it right, but sometimes it doesn't, and then you must rewrite the query by hand so the optimizer picks a better plan. This occurs especially when you use a lot of views or subqueries.
- Aggregates in PostgreSQL can be slow; for example, don't try count(*) on a result of 5 million records.
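To illustrate the points above, here is a minimal SQL sketch against a hypothetical `orders` table (the table and column names are invented for illustration, not from the original comment):

```sql
-- Refresh planner statistics for a single table; cheaper than a
-- database-wide ANALYZE, which can be costly on a large database.
ANALYZE orders;

-- Inspect the plan the optimizer chose. If its row estimates are far
-- off, rewriting the query (e.g. flattening a subquery or view into a
-- plain JOIN) can steer it toward a better plan.
EXPLAIN ANALYZE
SELECT customer_id, sum(total)
FROM orders
WHERE created_at >= '2006-01-01'
GROUP BY customer_id;

-- An exact count(*) must scan every visible row. When a rough figure
-- is enough, the planner's estimate from the last ANALYZE is nearly
-- free to read from the system catalog.
SELECT reltuples::bigint AS estimated_rows
FROM pg_class
WHERE relname = 'orders';
```

The `pg_class.reltuples` estimate is only as fresh as the last ANALYZE or VACUUM, which ties back to the first point: on large tables, keeping statistics current is both necessary and expensive.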
