From experience I’ve learned to always strive to decrease the number of database queries, since each query usually creates its own connection to the database, and each connection takes time and resources. Luckily for me Symfony (and Doctrine) does this for me, but even before I used Symfony, or on projects where I don’t use it, I always built a small tool to help me keep track of all queries. (Simply wrap all queries in a wrapper function, log the SQL command as well as the time needed in microseconds, and register a shutdown function to dump the data somewhere in the footer.)
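A minimal sketch of such a wrapper, assuming plain PDO underneath (the names `logged_query` and `$query_log` are mine, not from any framework):

```php
<?php
// Collected query log: one entry per executed statement.
$GLOBALS['query_log'] = array();

function logged_query(PDO $pdo, $sql, array $params = array())
{
    $start = microtime(true);
    $stmt  = $pdo->prepare($sql);
    $stmt->execute($params);
    // Record the SQL command and the elapsed time in microseconds.
    $GLOBALS['query_log'][] = array(
        'sql'  => $sql,
        'usec' => (microtime(true) - $start) * 1000000,
    );
    return $stmt;
}

// Dump the collected data at the end of the request (e.g. in the footer).
register_shutdown_function(function () {
    foreach ($GLOBALS['query_log'] as $i => $entry) {
        printf("#%d %s (%.0f us)\n", $i + 1, $entry['sql'], $entry['usec']);
    }
});
```

Route every query through `logged_query()` and the shutdown hook prints the full list, which makes a sudden jump in query count hard to miss.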

So back to the subject here. Sometimes when you use Symfony and Doctrine you may notice that the number of queries can rise pretty quickly. Sometimes this only becomes apparent after adding enough dummy data, but a good programmer knows where the weak points are. There are three major pitfalls here. I’ll list each with its solution.

1. Using related data.

In Doctrine it’s easy to create a relation in your database schema. It’s also very good practice, but you have to think when using it. Say I have a table called Objects, and a table called ObjectData. I put the proper relation on ObjectData (relations: { Object: { local: object_id, foreign: id } }). Next thing I know I’m getting a collection of ObjectData (ObjectDataTable::getInstance()->findByObjectId($id)) and want to know the related object name on a record ($record->getObject()->getName()) in a loop…

That’s the point where it goes wrong, but it’s sneaky, because you’ll only notice it in the number of queries once your ObjectData grows: every call to getObject() fires a separate query for that one record.

The solution?

ObjectDataTable::getInstance()->createQuery('a')->innerJoin('a.Object o')->where('a.object_id = ?', $id)->execute();

The code is just an example, but the important point is the innerJoin (or leftJoin) statement that links the tables, turning those hundreds of queries into one single joined query!
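Put together with the loop from the example above, the joined fetch might look like this (a sketch using the same made-up table names; the point is that the related Object is hydrated by the join, so the loop no longer queries per record):

```php
<?php
// One joined query instead of one query per record.
$records = ObjectDataTable::getInstance()
    ->createQuery('a')
    ->innerJoin('a.Object o')
    ->where('a.object_id = ?', $id)
    ->execute();

foreach ($records as $record) {
    // The Object relation was already hydrated by the innerJoin,
    // so getObject() reads from memory instead of hitting the database.
    echo $record->getObject()->getName();
}
```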

2. Deleting stuff

Another pitfall is deleting stuff. Did you know you can delete an entire collection? Sometimes you might need to: ObjectDataTable::getInstance()->findByObjectId($id)->delete();

Now that seems nice and easy, but if you look at the number of queries, you’ll notice that this deletes every record separately by id. That’s not what I wanted…

The solution?

There are two solutions here, but in this case I’ll go for the one I’m most sure of: using basic DQL.

Doctrine_Query::create()->delete()->from('ObjectData')->where('object_id = ?', $this->object->getId())->execute();

For the other solution you’ll have to read on, as the same method I used for the next one can also be applied on this issue.

3. Multiple INSERTs

Did you ever need to do multiple inserts? I just did again, to save the state of generic checkboxes: delete them all, then insert a row for every checked box. It took me a while to find out how best to handle the big list of INSERT statements that comes from running something like $o = new ObjectData; $o->setValue($val); $o->save(); inside a loop over the checkboxes.

The solution?

MySQL has a nifty solution for adding multiple VALUES to a single INSERT. I thought to use that, until I came across this thread:

The conclusion there is that using multiple VALUES tuples inside your INSERT is something you should not try with InnoDB (the default for Doctrine/Symfony, and almost always your best database engine for any project). The better solution is to wrap your INSERT statements in a single transaction:

$conn = Doctrine_Manager::getInstance()->getCurrentConnection();

try {
    $conn->beginTransaction();
    foreach ($data as $val) {
        $o = new ObjectData();
        $o->setValue($val);
        $o->save();
    }
    $conn->commit();
} catch (Doctrine_Exception $e) {
    $conn->rollback();
}

The rollback might not be strictly necessary here, but you never know when the database server might crash mid-insert, so why not add it. The important thing is that the whole batch is committed in a single transaction over one connection, making the overhead of the individual INSERTs a minor issue. The same method can also be applied to deleting a collection.
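As a sketch, the same transaction pattern applied to the collection delete from pitfall 2 (same made-up table and $id as above; the per-record DELETEs still run, but inside one transaction on one connection):

```php
<?php
$conn = Doctrine_Manager::getInstance()->getCurrentConnection();

try {
    $conn->beginTransaction();
    // Each delete() is still its own DELETE statement,
    // but they all share one transaction and one connection.
    foreach (ObjectDataTable::getInstance()->findByObjectId($id) as $record) {
        $record->delete();
    }
    $conn->commit();
} catch (Doctrine_Exception $e) {
    $conn->rollback();
}
```

The DQL delete from pitfall 2 is still the cheaper option when it applies, since it issues a single statement; this variant is useful when per-record listeners or cascades need to fire.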