This all works very well; however, it does present a challenge when developing locally: how do we load data that mocks the expected external data? The two projects approached the problem differently, and each approach had its own pros and cons.
Approach 1: Conditionally load the mock data using migrations. On the first project we used PostgreSQL for local development and Oracle in production. Because the databases differed, we could write our migrations to load mock data only when the database being migrated was PostgreSQL. A pro of this approach is that running all the migrations completely sets up the database, local or production. A con is that more migrations are necessary, and some of them become harder to maintain because of the conditional noise.
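A guarded migration might look something like the following sketch. The migration name and the books data are invented for illustration; adapter_name is how an ActiveRecord connection reports which database it is talking to, and the PostgreSQL adapter reports 'PostgreSQL'.

class LoadMockBooks < ActiveRecord::Migration
  def self.up
    # Only insert mock data when running against the local PostgreSQL database.
    if ActiveRecord::Base.connection.adapter_name == 'PostgreSQL'
      execute "insert into books (title, pages, classification) values ('RubyFoo', 300, 'Technical')"
    end
  end

  def self.down
    if ActiveRecord::Base.connection.adapter_name == 'PostgreSQL'
      execute "delete from books where title = 'RubyFoo'"
    end
  end
end

In production, against Oracle, the same migration runs but the conditional body is skipped, so the schema history stays identical across environments.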
Approach 2: Create a rake task to load data that your application does not own. PragDave inspired this approach with his blog entry on how to use Migrations Outside Rails. Using ActiveRecord::Schema.define it's easy to create the mock tables. Following table creation, you can populate them by creating models and saving records.

ActiveRecord::Schema.define do
  create_table :books, :force => true do |t|
    t.column :title, :string
    t.column :pages, :integer
    t.column :classification, :string
  end
end

class Book < ActiveRecord::Base
end

Book.create(:title => 'RubyFoo', :pages => 300, :classification => 'Technical')

A pro of this implementation is the clear separation between the mock data and the data necessary for running your application. This also helps reduce the risk of loading invalid data in production. A con is that an extra step is required when first setting up or resetting a database. This con can be partially mitigated by creating a task that runs the migrations and then loads the mock data, as sketched below; however, this introduces more code to the codebase and requires developers to remember one more rake task.
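A combined task could look something like this minimal sketch; the file name and the db:load_mock_data and db:prepare task names are invented for illustration.

# lib/tasks/mock_data.rake
namespace :db do
  desc 'Create and populate tables owned by external systems'
  task :load_mock_data => :environment do
    # Build the mock tables, wiping any previous copies.
    ActiveRecord::Schema.define do
      create_table :books, :force => true do |t|
        t.column :title, :string
        t.column :pages, :integer
        t.column :classification, :string
      end
    end

    class Book < ActiveRecord::Base
    end

    Book.create(:title => 'RubyFoo', :pages => 300, :classification => 'Technical')
  end

  desc 'Run all migrations, then load the mock data'
  task :prepare => ['db:migrate', 'db:load_mock_data']
end

With something like this in place, a single rake db:prepare rebuilds both the application's own schema and the mock external tables, at the cost of one more task for developers to know about.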