Situation
A Ruby on Rails project I’m working on during my day-time job (as if I have a night-time one) has a couple of tables with fields that are changed by external processes. (Note: it’s a legacy Oracle database that’s had some slight mods where possible to make it more Rails-y.) The bulk of the fields in these tables, however, contain data that can be changed through a Rails interface I’ve built. The externally managed fields need to be initialized when a new record is written and should be readable (for display) like normal fields.
So what’s the problem then?
Normally you’d just issue an UPDATE that modifies only the needed fields, but at the moment ActiveRecord can only alter a record by doing a read followed by a write of all its fields (please correct me if I’m wrong).
This poses the problem that some fields might be changed by an external process after the read but before the write, causing ActiveRecord to write the stale data back to the table (resulting in a whole bunch of unwanted, very evil things that I really don’t even want to think about fixing).
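To make the race concrete, here’s a rough sketch of the failing sequence (the model and field names are made up for illustration):

record = LegacyTable.find(42)                  # Rails reads ALL columns, including external_status
# ...an external process now changes external_status directly in Oracle...
record.update_attributes(:name => 'new name') # Rails writes ALL columns back,
                                               # including the stale external_status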
Solution
I found a workaround by overriding update_attributes in my model as follows:
def update_attributes(attributes)
  # Drop the externally managed fields from the record's attribute hash
  # so they are left out of the UPDATE statement that ActiveRecord builds.
  @attributes.delete('fieldthatshouldnotbechanged1')
  @attributes.delete('fieldthatshouldnotbechanged2')
  @attributes.delete('fieldthatshouldnotbechanged3')
  super(attributes)
end
This effectively deletes the fields from the hash that’s used to construct the UPDATE statement. You’ll see in your development.log that the issued UPDATE no longer contains the deleted fields.
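If the list of fields grows, the same override can be written with the names kept in one place. This is just a sketch of the same idea and should behave identically:

READ_ONLY_FIELDS = %w(fieldthatshouldnotbechanged1
                      fieldthatshouldnotbechanged2
                      fieldthatshouldnotbechanged3)

def update_attributes(attributes)
  # Same trick as above: drop the externally managed fields before the save.
  READ_ONLY_FIELDS.each { |field| @attributes.delete(field) }
  super(attributes)
end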
I’ve only tried this with update_attributes, since that’s what I use to update the fields. I hope this doesn’t cause any unwanted issues, but if it does I’ll post about it here.
May 11, 2010 at 12:49 am
Thanks for the input on modifying update_attributes. I had a problem where whiny nils choked on deleting a nil object. I added a condition checking for that and it worked like a charm.
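The commenter didn’t post code, but the guard presumably looks something like this (a guess at the condition):

def update_attributes(attributes)
  %w(fieldthatshouldnotbechanged1 fieldthatshouldnotbechanged2).each do |field|
    # Skip the delete when the attribute is nil, so whiny nils
    # has nothing to complain about.
    @attributes.delete(field) unless @attributes[field].nil?
  end
  super(attributes)
end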