It’s okay to make mistakes when you are building simple applications. But for mission-critical applications like banking or payment processors, you have to be very careful and listen to seniors and domain experts on critical matters.
Let’s say you are working on a feature that updates the total amount deposited into a particular account. A simple implementation might look like this:
```ruby
# Called just after the amount is debited from the payer's account.
def credit_receiver_account(debited_amount)
  update_attribute :total_amount, total_amount + debited_amount
end
```
What’s wrong with the code above?
Consider that the receiver account belongs to a conglomerate like Google, which receives payments every millisecond. There is always a chance that `total_amount` holds stale data by the time the update runs, which can be chaotic. This happens because the in-memory Ruby object and the corresponding database row can hold different values.
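To see the lost update concretely, here is a plain-Ruby sketch (no database involved, the names are illustrative) where two workers each read a stale total and then write back:

```ruby
# A simulated "database row" holding the account total.
row = { total_amount: 100 }

# Both workers read the current total before either writes (stale reads).
read_a = row[:total_amount] # 100
read_b = row[:total_amount] # 100

# Worker A credits 50, worker B credits 30, each based on its stale read.
row[:total_amount] = read_a + 50 # writes 150
row[:total_amount] = read_b + 30 # writes 130, clobbering A's update

row[:total_amount] # => 130, but the correct total is 180
```

Worker A's credit of 50 has silently vanished: that is exactly the lost-update problem locking is meant to prevent.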
How do we avoid such a race condition?
Just as you would stop a race by pausing all the runners but one, you lock the data so that only one process can update it at a time, maintaining consistency.
There are two types of locking: optimistic and pessimistic.
What is Optimistic Locking
Optimistic locking allows multiple users to access the same record for edits, assuming conflicts with the data will be rare. It works by checking whether another process has changed the record since it was read; if so, an ActiveRecord::StaleObjectError exception is raised and the update is rejected. In Rails, this behaviour is enabled by adding a `lock_version` column to the table.
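Since a real Rails example needs a database, here is a minimal in-memory sketch of the version-check idea behind `lock_version` (the class names and helper are illustrative, not Rails itself):

```ruby
# Stand-in for ActiveRecord::StaleObjectError.
class StaleObjectError < StandardError; end

Row = Struct.new(:total_amount, :lock_version)

# Apply an update only if the row's version still matches what we read;
# otherwise someone else changed the row first, so the write is stale.
def optimistic_update(row, read_version)
  raise StaleObjectError if row.lock_version != read_version

  yield row
  row.lock_version += 1 # successful write bumps the version
end

row = Row.new(100, 0)

# Workers A and B both read version 0.
version_a = row.lock_version
version_b = row.lock_version

# A's update succeeds and bumps the version to 1.
optimistic_update(row, version_a) { |r| r.total_amount += 50 }

# B's update fails because its version is now stale.
begin
  optimistic_update(row, version_b) { |r| r.total_amount += 30 }
rescue StaleObjectError
  # In Rails you would reload the record and retry the update here.
end

row.total_amount # => 150 (B's stale write was rejected, not lost silently)
```

The key difference from the unlocked version: the conflicting write is detected and rejected rather than silently overwriting newer data, and the caller can reload and retry.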
Using Pessimistic Locking
The race condition can be fixed with pessimistic locking, which uses the locking features of the database itself. This technique protects not only against stale-object updates but also against truly concurrent updates (occurring almost simultaneously), which are rare in the real world.
```ruby
def credit_receiver_account(debited_amount)
  # with_lock reloads the record, acquires a row-level database lock
  # (SELECT ... FOR UPDATE), and runs the block inside a transaction,
  # so total_amount is guaranteed to be fresh when we add to it.
  self.with_lock do
    update_attribute :total_amount, total_amount + debited_amount
  end
end
```