Test Data

At XDC 2019 my session was titled Xojo Design Mistakes (the alternate but way longer title was ‘Thankfully time travel doesn’t exist or my future self might travel back and murder my younger self for the stupid coding mistakes I’ve made’).  These are things that I’ve discovered over the years, both in my own projects and in other people’s projects, that are just plain wrong or less than ideal.  This will be an on-going series since I had well over 50 slides and left 150 out.  So when I get bored I’ll bring these topics up.

Nearly all of our consulting projects are database driven applications.  It’s why we’ve created the tools to help with these projects like ARGen, which simplifies our interactions with the database, and BKS Shorts, which is our own reporting tool.  These tools are invaluable in getting our work done in a timely manner.

In a database application it’s typical to have a List of something.  A common example of this is a Customers list.  In that list the client typically wants the ability to Create, Read, Update, and Delete (or CRUD) a customer with varying degrees of rules behind it (like do they have permissions to add or delete a customer?).

During development we get the List form going, add the controls to be able to add a new record.  Then we create the Add/Edit form that allows us to test those capabilities.  We create a few, update a few, delete a few customers and then move on.  Maybe the client wants search capabilities so we add that to the List window and when we’ve tested it against our half dozen or so records we move on to the next task.

There is nothing wrong with this process.  It works and it’s fairly efficient as far as it goes.  However, there’s one thing we’ve skipped that’s really important but also difficult to achieve.

So far we’ve tested with *maybe* a dozen records.  What happens when the client adds 10,000, or 100,000 Customer records?  Does the list form take a long time to load?  Does the search function take a long time?  What about the Customer popup menus that you’ve scattered throughout the project – are those now slow, unwieldy, and unusable?

Unfortunately, with the way we implemented the project we don’t know how any of this works since we only have a dozen records.  So it’s really important to have adequate amounts of test data.  Creating 10,000 new customers using your new interface would take a long time.  So what can you do?

There are tools out there that will help generate data sets.  These tools allow you to create thousands, even millions of rows of realistic data.  Randomized male and female first names along with last names are a great way to generate customer names.  Many tools allow you to add random dates in a range, random IP addresses, random values from a list you provide, and so on.  The sky is the limit when it comes to what sort of data developers need.
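
If you’d rather roll your own, a quick-and-dirty generator in Xojo can go a long way.  Here’s a minimal sketch; the table, columns, and name lists are purely illustrative:

// A minimal test data generator.  Table/column names and the name lists are illustrative only.
Dim firstNames() As String = Array("Alice", "Bob", "Carol", "David", "Emma", "Frank")
Dim lastNames() As String = Array("Smith", "Jones", "O'Neil", "Garcia", "Keeney", "Chen")
Dim r As New Random

db.SQLExecute("BEGIN TRANSACTION")   // wrap the inserts so they run quickly
Dim ps As SQLitePreparedStatement = db.Prepare("Insert Into customer (first_name, last_name, created) Values (?, ?, ?)")
ps.BindType(0, SQLitePreparedStatement.SQLITE_TEXT)
ps.BindType(1, SQLitePreparedStatement.SQLITE_TEXT)
ps.BindType(2, SQLitePreparedStatement.SQLITE_TEXT)

For i As Integer = 1 To 10000
   ps.Bind(0, firstNames(r.InRange(0, firstNames.Ubound)))
   ps.Bind(1, lastNames(r.InRange(0, lastNames.Ubound)))
   ps.Bind(2, New Date)
   ps.SQLExecute
Next

db.Commit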

Now, when you do your testing you see how your application reacts with a lot of data.  I almost guarantee that it will act differently.  Do you need to switch to a data-on-demand listbox?  Do you need to put an index on a commonly searched field to speed up queries?  Do you need to implement Full Text Search in your database?  Having a huge amount of data will answer these questions for you.

I once worked on an accounting application in VB6 where the original database designer used an Access database and calculated account balances on the fly by iterating through bills, checks, journal entries, etc. With a few thousand rows of data in each table this process took a second or two for all balances on a local machine. When this database was accessed over the network it took 5 to 7 seconds. When we converted our first client database it took 30 to 40 seconds for EACH account! Obviously this was not acceptable performance from an accounting application meant to be used daily by general contractors with hundreds of employees and tens of thousands of customers. The solution was to have a current balance value that was stored and then updated when a transaction occurred. We could have saved ourselves hundreds of hours of rushed development time (and much stress and heartache) if we had tested with large amounts of data much earlier in the process.
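
In Xojo terms the stored balance fix looks roughly like this; the table and column names here are hypothetical, but the idea is to post the transaction and adjust the stored balance inside the same database transaction:

// Hypothetical sketch: insert the transaction and update the stored balance together.
db.SQLExecute("BEGIN TRANSACTION")

Dim ps As SQLitePreparedStatement = db.Prepare("Insert Into account_transaction (account_id, amount) Values (?, ?)")
ps.BindType(0, SQLitePreparedStatement.SQLITE_INT64)
ps.BindType(1, SQLitePreparedStatement.SQLITE_DOUBLE)
ps.Bind(0, iAccountID)
ps.Bind(1, dAmount)
ps.SQLExecute

// keep the denormalized current_balance column in sync
Dim psBalance As SQLitePreparedStatement = db.Prepare("Update account Set current_balance = current_balance + ? Where account_id = ?")
psBalance.BindType(0, SQLitePreparedStatement.SQLITE_DOUBLE)
psBalance.BindType(1, SQLitePreparedStatement.SQLITE_INT64)
psBalance.Bind(0, dAmount)
psBalance.Bind(1, iAccountID)
psBalance.SQLExecute

If db.Error Then
   db.Rollback
Else
   db.Commit
End If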

I mentioned adding an index to a field earlier. One word of caution on this: it’s tempting to add an index to every field you’re searching on. Don’t do this! Only add indexes to the most important fields in a table. For a customer, maybe the two most important fields are phone number and name even though you also search on City and things like that. Maintaining an index is extra work for the database, so performance can take a significant hit for every field you index.
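
Adding the one or two indexes that matter is a single statement each.  In SQLite it might look something like this (names made up):

// Index only the columns you actually search on the most.  Table/column names are illustrative.
db.SQLExecute("CREATE INDEX IF NOT EXISTS idx_customer_phone ON customer (phone)")
db.SQLExecute("CREATE INDEX IF NOT EXISTS idx_customer_name ON customer (last_name, first_name)")
If db.Error Then MsgBox "DB Error: " + db.ErrorMessage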

Since the tool I’ve been using to create test data is no longer being sold, I’m curious what you’d recommend.  Do you have a favorite tool?  Or is this a tool that would be of use to the community?

Happy Coding!

ARGen 3.0.3

BKeeney Software Inc. is proud to announce an update to ARGen, our ActiveRecord Generator utility for Xojo developers. This minor update includes dark mode support, speed improvements, and important updates for generated projects. Updating to 3.0.3 is recommended for all ARGen users.

ARGen is available for macOS and Windows. It can be used for free in limited mode, and is priced at $99.95 to unlock all features. Existing version 2.x users will automatically be provided an upgrade opportunity when launching version 3.

3.0.3 Release Notes:

Changes:

  • Added Dark Mode support
  • Simplified manual relationship management
  • Selecting a different SQLite database now clears the password field
  • kMaxReturn is now a protected constant for cleaner code
  • DBUpdates module code is now cleaner
  • Improved instructions in some locations
  • Base project templates optimized
  • Preferences module no longer writes to SpecialFolder.Preferences
  • iOS Create Data Sources defaults to true
  • Updated links to Xojo documentation
  • Generated localization module constants are now protected

Fixes:

  • DBUpdates.SetDBVersion no longer uses a BKS extension synonym for str()
  • Fixed return statement for iOS apps using 2018r2
  • Projects with empty name now have default save name
  • BKS Created/Modified overrides no longer generate properties that fail to Register
  • Corrected minor UI bug on Windows
  • Project listing loads faster
  • Speed improvements throughout the software
  • Projects created but never saved are no longer retained when closed
  • Checking for updates at launch now works
  • Preferences window will show the last update check time

Pricing, examples, and more details can be found at the project homepage at https://bkeeney.com/argen/ 

Updating Records with Transactions

In my last blog post I talked about inserting records into a SQLite database and found that using transactions makes them incredibly fast.  Does the same hold true for updating an existing record?  We’ll find out.

In this set of tests I inserted 100,000 records and then randomly selected 10,000 records to update with our same random name, age, and date values.  In theory this means I have records distributed all over the range of the database.

In the first test we use a simple RecordSet, which means we have to query for the record first, call its Edit method, change the values, and call Update.  It’s a two-step process.

For i As Integer = 0 To ari.Ubound
   Dim rs As RecordSet
   rs = db.SQLSelect("Select * from t_table where table_id = " + Str(ari(i)))
   If db.Error Then
      MsgBox "DB Error: " + db.ErrorMessage
      Return
   End
   rs.Edit
   rs.Field("Name").StringValue = GetName
   rs.Field("Age").IntegerValue = GetAge
   rs.Field("TheDate").DateValue = New Date
   rs.Update
   If db.Error Then
      MsgBox "DB Error: " + db.ErrorMessage
      Return
   End
Next

Unsurprisingly, this method took about 49 seconds for 10,000 records.

In test two I switched to using a PreparedStatement to update the record.

Dim sql As String = "Update t_table Set Name = ?, Age=?, TheDate =? WHERE table_id = ?"
Dim ps As SQLitePreparedStatement = db.Prepare(sql)
ps.BindType(0, SQLitePreparedStatement.SQLITE_TEXT)
ps.BindType(1, SQLitePreparedStatement.SQLITE_INTEGER)
ps.BindType(2, SQLitePreparedStatement.SQLITE_TEXT)
ps.BindType(3, SQLitePreparedStatement.SQLITE_INT64)
For i As Integer = 0 To ari.Ubound
   ps.Bind(0, GetName)
   ps.bind(1, GetAge)
   ps.Bind(2, New Date)
   ps.bind(3, ari(i) )
   ps.SQLExecute
   If db.Error Then
      MsgBox "DB Error: " + db.ErrorMessage
      Return
   End
Next

When run for 10,000 records this takes about 47 seconds so there is a little bit of time savings by not querying the database for the data to begin with.  It’s also possible that the PreparedStatements aren’t nearly as efficient as one would hope.

I then repeated the same tests within a Transaction.  In both cases updating records effectively took no time at all!  If I increase the number of updates to 100,000, both the RecordSet version and the PreparedStatement version take about 8 seconds.
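
For reference, the transaction version is just the same update loop bracketed by the transaction calls, a quick sketch:

db.SQLExecute("BEGIN TRANSACTION")
If db.Error Then
   MsgBox "DB Error: " + db.ErrorMessage
   Return
End

// ... the RecordSet or PreparedStatement update loop from above goes here ...

db.Commit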

I’m surprised at these results because you would think that doing the query required for the RecordSet would take more time, but it’s no different than using the PreparedStatement SQL Update.  Why is that?  I have a couple of theories.  First, the table isn’t very complex with only 4 columns and I’m only ever querying and updating based on the primary key.  I’m sure a more complex query would slow things down.  We also have no indexes on any of the columns, which can cause the database to do more work behind the scenes.

Regardless, we learned a few things today.  First, database transactions are critical if you want to do large-scale database manipulation in a timely manner.  Second, it doesn’t matter if you use RecordSet Edit/Update or if you use PreparedStatements as either one is fast enough.

What sorts of questions do you have about databases that I can test for you?

Database Transactions in Xojo

Every now and then someone on the Xojo forum wonders why inserting data into an SQLite database is so slow.  Databases are designed to hold billions of records and do it fast, right?  So why is their application so slow?  The answer is that they are relying upon the automatic transaction behavior built into SQLite.

By default SQLite wraps every statement in its own automatic transaction.  This means that as soon as you attempt to insert, update, or delete data, the work to write that change to disk happens immediately, one statement at a time.

For example, let’s take the following bit of code to insert data into a table:

For i As Integer = 1 To kMax
   Dim dbr As New DatabaseRecord
   dbr.Column("Name") = GetName
   dbr.IntegerColumn("Age") = GetAge
   dbr.DateColumn("TheDate") = New date
   db.InsertRecord "t_table", dbr
   If db.Error Then
      MsgBox "DB Error: " + db.ErrorMessage
      Return
   End
Next

GetName gets a random name from an array that has 26 names in it and adds a random integer between 1000 and 10000.  GetAge returns a random integer between 1 and 74. 
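
If you want to follow along at home, the helpers are trivial.  A rough sketch (the exact name list doesn’t matter):

// Rough sketches of the helper methods described above.
Function GetName() As String
   // the real test uses 26 names; a short list is enough to show the idea
   Dim names() As String = Array("Alice", "Bob", "Carol", "Dave", "Eve", "Frank")
   Dim r As New Random
   Return names(r.InRange(0, names.Ubound)) + Str(r.InRange(1000, 10000))
End Function

Function GetAge() As Integer
   Dim r As New Random
   Return r.InRange(1, 74)
End Function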

This is a pretty simple insert, and if you run it 10,000 times it takes roughly 49 seconds saving to a desktop SQLite file on my 5K iMac.  The application is locked up for that entire time because we’re in a tight loop.  This is simply unacceptable.

I’m sure someone is screaming “why are you using DatabaseRecord?  It’s slow!  It’s inefficient!  You should be using PreparedStatements!”  Okay, so using pretty much the same logic:

Dim sql As String = "Insert into t_table (Name, Age, TheDate) Values(?, ?, ?);"
Dim ps As SQLitePreparedStatement = db.Prepare(sql)
ps.BindType(0, SQLitePreparedStatement.SQLITE_TEXT)
ps.BindType(1, SQLitePreparedStatement.SQLITE_INTEGER)
ps.BindType(2, SQLitePreparedStatement.SQLITE_TEXT)
For i As Integer = 1 To kMax
   ps.Bind(0, GetName)
   ps.bind(1, GetAge)
   ps.Bind(2, New Date)
   ps.SQLExecute
   If db.Error Then
      MsgBox "DB Error: " + db.ErrorMessage
      Return
   End
Next

This takes about 48 seconds.  No big time savings there.  Obviously that’s not the improvement we need.

What we need to do is put these inserts in a database transaction using the following bit of code before the loop starts:

db.SQLExecute("BEGIN TRANSACTION")
If db.Error Then
   MsgBox "DB Error: " + db.ErrorMessage
   Return
End

And then at the end use:

db.Commit

Using the DatabaseRecord method takes a whopping 1 second with 10,000 records.  Using the prepared statement is so fast that my measured elapsed time in seconds is effectively 0.  

If I up the number of records inserted to 1,000,000 I get an interesting result.  The DatabaseRecord method takes 27 seconds where the PreparedStatement method takes 47 seconds.  And if I declare the PreparedStatement inside the loop it now takes 55 seconds.

What have we learned in this blog post?  First, using an explicit database transaction is considerably faster than using the default transaction behavior.  Second, using DatabaseRecord is pretty fast and depending upon the number of records inserted it might be considerably faster.  Honestly, I didn’t expect this.

In the next blog post I’ll look at record updates and what the best methods are.

Nil Recordset

The question of how to deal with nil RecordSets is OFTEN asked in the Xojo forums.  Either the original poster or a well-meaning responder will do something like this:

dim rs as recordset 
rs = db.sqlselect(sql)
If rs <> Nil Then
     If Not rs.EOF Then
       // No records in table or a bad query?
     End If
End If

It works, but it only masks the real issue.  Why is rs nil in the first place?  The answer is because the SQL statement to get the RecordSet had an error!  What is that error?  We have no idea because the code doesn’t ask for the error.

Most of the time the database will give you an error message that will point you in the right direction.  How do you tell that the database had an error?  You check the Error property.

dim rs as recordset 
rs = db.sqlselect(sql)
if db.error then
     //Do something
end


The database can give you an error code but this varies from database to database so I don’t find it exceptionally useful.  However, there is an ErrorMessage property that gives you a human readable message.  

dim rs as recordset 
rs = db.sqlselect(sql)
if db.error then
     MsgBox "DB Error: " + db.ErrorMessage
     Return
end


In this example I’m showing the error to the user, but for most uses that’s probably not a great idea since you don’t want the average user seeing raw database errors.  However, it’s great while debugging.

The error you get back might be a little mysterious.  SQL is a language specific to databases and the error message returned expects you know a little bit about databases.

The second part of my message to you is that if you’re using SQLSelect or SQLExecute in your code you should look at PreparedStatements.  They’re safer in that PreparedStatements have some security features and do the heavy lifting of taking the input data and putting it into the right format.  What do I mean by this?

Normally you would have an SQL statement like this:

dim sql as string
sql = "Select * from users where last_name = 'Keeney';"


This selects all users whose last name is ‘Keeney’.  If you were going to allow the user to search for whatever name they want you would end up doing something like this:

dim sName as string
sName = txtSearchField.Text
sql = "Select * from users where last_name = '" + sName + "';"


Note that I had to manually put the single quotes in before and after the name since the database column is text.  This works great until you enter an Irish name like O’Neil.  The database will choke on this because it has an extra apostrophe in the statement.  You can escape this data yourself but what a pain.  Instead, use a PreparedStatement.

dim ps as SQLitePreparedStatement
dim sql as string
sql = "Select * from users where last_name = ?"
ps = db.prepare(sql)
ps.bindtype(0, SQLitePreparedStatement.SQLITE_TEXT)
ps.bind(0, txtSearchField.text)
dim rs as Recordset
rs = ps.SQLSelect
if db.error then
     MsgBox db.errormessage
     return
end


Nowhere in the code do I have to put in apostrophes or escape any data.  It’s just handled, and this is true whether your database columns are text, date, or numeric.

PreparedStatements have a number of advantages besides the automatic formatting.  They help with security by guarding against SQL injection attacks.  If you don’t know what those are just do a web search to learn more.  You can reuse PreparedStatements in code, but in my testing it doesn’t save much time as it appears the database plugin caches them internally.  Regardless, there are so many reasons to be using PreparedStatements over straight SQL.

To recap:  A nil recordset always means a database error has happened.  You need to check for the error after every operation whether that be a Select, Insert, Update, or Delete statement.  Basically after every SQLSelect and SQLExecute statement.  Ideally you want to start moving all of your code to use PreparedStatements because of all of their advantages.

The Xojo Community is Awesome

Have I told you how much I love the Xojo community?  I’ve been part of it for fifteen years and I’ve met hundreds of Xojo developers at developer conferences and probably exchanged emails with thousands more.  I am amazed at how much this community helps each other and I wish there was a way to promote that as a key feature of the product.  It’s a big deal.  Really!

If you’re just starting out using Xojo know that there are a bunch of people, myself included, that are willing to help out, if we can, on your journey.  Programming is hard.  Well, I don’t think it’s hard because I’ve been doing it for so long, but it is complex at times and that makes it hard.  Just ask your question in the Xojo forums and you’ll almost always get an answer within hours.

Even Xojo pros, such as myself, have need of help.  Xojo covers Mac, Windows, Linux desktop, console, and web apps.  It does iOS apps for iPhone and iPad.  It now does Raspberry Pi for heaven’s sake!  It works with dozens of different databases.  There is simply no way any one person is going to know everything there is to know about Xojo.  It just can’t happen.  So yes, I go to the forums, all the time, and ask for help.

Just the other day I asked for some help with WooCommerce.  Not Xojo related, really, but certainly related to a project we’re working on for a client.  Within a few hours I had half a dozen developers private message me saying they might be able to help.  Subsequent contact narrowed that list down a bit but the point is that I have probably shaved off several days’ worth of work simply by asking for advice.

I am biased towards Xojo, naturally, as it’s been my primary development language for fifteen years.  I think I’d be hard-pressed to find a friendlier community.  I call many on the forums my friends even though I’ve never physically met them.  The few that I’ve met in person have lived up to their forum reputations and are really friends for life.

So maybe this is my belated Thanksgiving post.  I am thankful that so many years ago I jumped feet first into the tool.  I asked questions – many of them silly and redundant.  I became more proficient and then made another jump to start blogging about it, making products for other developers, and training the next generation of developers.

So if you are in need of a cross-platform development tool I highly recommend Xojo.  It ain’t perfect but no development tool is.  If you jump in I think you’ll love the community.  I know I do.

What say you fellow Xojo developers?

Introducing ARGen 2.0

ARGen More Powerful Than Ever!

BKeeney Software is pleased to announce the release of version 2 of ARGen, our ActiveRecord generator utility.  The new release includes many enhancements.  Some of the highlights are:

  • Can now create User Interface elements.
  • Create entire Desktop and Web projects with the proper database connections for each type.
  • Choose between standard database error reporting and a more robust version that BKeeney Software provides.
  • Can now create foreign key elements automatically.
  • Ability to create relationships without having to put them into foreign keys in the database.
  • Works with more databases.
  • Added ability to use database views.
  • Completely redesigned application!

Purchase Mac Version
Purchase Windows Version

Note:  Without the paid upgrade, you can still use the free version with a nag screen.  The free version has no time limitation, but it is limited to two tables at a time and will not create any User Interface elements.

If you are an existing user and did not receive an email containing an upgrade coupon code please contact us at support@bkeeney.com.

If you want to see ARGen 2.0 in action, please visit our new video at http://www.bkeeney.com/allproducts/argen/argen-2-0/

Product Home Page:  http://www.bkeeney.com/allproducts/argen/

XojoTalk 027 – Database Goddess

We at BKeeney Software are blessed in so many ways.  All of our employees bring a unique and interesting mix of talents and experiences.  It’s not just about one person and we often bounce ideas off each other to get the best possible result.

In the latest XojoTalk, Paul interviews our CEO, Carol Keeney, about how she uses Xojo and gets her thoughts on databases.  Carol has a ton of project management experience that makes running BKeeney Software easy since we often have a half dozen projects going at a time.

She has a lot of database experience too.  That has given her the “Database Goddess” nickname.  Again, that experience is so helpful for the programmers because, really, you don’t want your programmers designing the database.  We tend to do things the easy way which might not be the right way.

I thought it was an excellent interview.  You can find the XojoTalk at http://blog.xojo.com/2016/07/26/xojotalk-027-database-goddess/

Converting FileMaker to Xojo And ActiveRecord

We are currently converting a FileMaker app to a Xojo web app.  We are about  3/4 of the way through the project and it’s been a surprisingly easy conversion.  Our biggest challenge has been normalizing the database since the original FileMaker developer did some things that were less than ideal.

Hal Gumbert over at Camp Software is starting a series of blog posts on their own transition from FileMaker to Xojo.  It is a recommended read.

One of the big things that many developers want coming from FileMaker and MS Access and other tools where the database is tightly integrated into the development tool is data binding.  It makes for a quick way to load/save data to and from the user interface.  We don’t do data binding and I’ll get into that a bit later.

In Hal’s blog post he goes into the various Xojo options and ActiveRecord is one of them.  I thought I’d spend a little time talking about ActiveRecord to fill you in on what it does.

ActiveRecord eliminates many common mistakes that developers make when creating database applications using Xojo.  How many times have you mistyped a table or field name in an SQL query?  We used to do it a lot and ActiveRecord eliminates much of that.  It does this by creating a NameSpace module and a class for each table.  The properties in those classes then map to the fields in each table.
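
To make that concrete, here is a purely illustrative usage sketch.  The class, property, and finder method names below are hypothetical stand-ins for whatever your own tables generate:

// Illustrative only: Data is the namespace module, T_Customer a generated class,
// and GetByID stands in for whichever shared "find one record" method you create.
Dim oCustomer As Data.T_Customer = Data.T_Customer.GetByID(iCustomerID)
oCustomer.sLastName = "Keeney"                     // property maps to the last_name field
oCustomer.iVisitCount = oCustomer.iVisitCount + 1  // mistype a property name and the compiler complains
oCustomer.Save                                     // ActiveRecord writes the Update for us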

A register function for ActiveRecord uses Introspection to ensure you have all of the tables and fields from the database mapped in your classes.  If not, an assertion takes place in debug mode which tells the developer if they’re missing a table, field, or if a field is mapped to the wrong datatype.  This is very handy on large projects where you might be adding a bunch of fields to meet changing conditions and this way you definitely will not forget to add them to the ActiveRecord classes.

Creating the classes can be tedious, especially with very large databases.  Our ARGen utility helps by scanning your database and generating the classes for you.  For some this might seem backwards, but we tend to design the database first and then code to it, and we find that ARGen does 75% of the repetitive work for us by creating the classes and adding some shared methods to each class that help in queries and finding a particular record.

Once the classes are created, using ActiveRecord is fairly simple.  To get a list of records in normal Xojo code you would write a query.  ActiveRecord does something similar using a class shared method.  Here is an example of using the List shared method to load a ListBox:



for each oCompany as Data.T_Company in Data.T_Company.List(sCriteria, sSort, iOffset)
   lst.AddRow oCompany.sCompanyNameCurrent, _
      oCompany.sStreet1, oCompany.sCity, _
      oCompany.sStateCode, _
      oCompany.sZipCode, _
      oCompany.sCompanyStatusName, _
      oCompany.sAgentName, _
      oCompany.sParentName
   dim iRow as integer = lst.LastIndex
   lst.RowTag(iRow) = oCompany
next


Data is the NameSpace and we are calling the T_Company List method and we pass in three parameters.  The first is our search criteria, the second is the sort criteria, and the last is the offset which allows us to ‘page’ our data.  It returns an array of Data.T_Company objects and we simply add what we need to the ListBox and stash the object in the RowTag.  The best part about this is that AutoComplete in the Xojo code editor will show us the table and field names and we don’t have to remember any of it.

When we wish to edit the record we grab it from the ListBox.RowTag property and pass it in to our editor.



dim oCompany as Data.T_Company = lst.RowTag(lst.ListIndex)
dim pg as new pgCompanyDetails
pg.Display oCompany


ActiveRecord doesn’t do data binding.  We simply don’t find it useful for a variety of reasons.  First, to do data binding you need to have controls that can handle the data source.  We could create control subclasses, but after working with custom data bindings in Xojo on a project or two I was not happy with the endless tweaking we had to do to get them to work properly.  Maybe someone with more patience than I have will be satisfied with it but I never was.  Plus, most developers I’ve met that have done data binding on large projects remain unsatisfied in some form or another or go to extraordinary lengths to make it ‘easy’ (like having every field be a string even for things that should clearly be a numeric data type).

Instead we chose a much simpler route.  In our edit forms we have three methods:  Load, Save, Validate.  We feel this offers us some advantages over binding.  First, everything is local to the window.  We don’t have to go find the subclass that handles the data load, save, and validate.  This lets us customize everything for that particular form.  An example Load method:



Private Sub Load()
   if moCompany.IsNew then
      lblCompanyID.text = "New"
      pmStatus.Enabled = false
   else
      lblCompanyID.text = moCompany.iCompany_ID.ToString
      pmStatus.setid moCompany.iCompanyStatus_ID
      pmStatus.Enabled = true
   end

   //Other code here

   if moCompany.IsNew then
      ccDatePicker1.dtmSelected = new date
   else
      ccDatePicker1.dtmSelected = moCompany.dtClientSince
   end

   txtCompany.text = moCompany.sCompanyNameCurrent
   ccLastModified1.SetRecord moCompany
End Sub


Right away we can see that what we load depends on whether the record is new or existing.  Data binding wouldn’t help us there.  Labels and TextFields are the easiest to do data binding with, but since you’ll need a TextField to do a numbers-only field or a date field you now have to create multiple subclasses.  Probably not a big deal but we’ve found it to be a hassle.  Having everything local means we can handle the edge cases with ease rather than having to modify the control subclass that’s doing the binding.

Before we can call our save method we have a Validate method that simply returns true if everything is okay.  If not, it presents a message to the user:



Private Function Validate() As boolean
   SetError ""

   if txtCompany.text.trim = "" then
      seterror "Validation Error.  Company name cannot be blank."
      txtCompany.SetFocus
      return false
   end

   if Data.T_Company.IsDuplicate(txtCompany.text.trim, moCompany.ID) then
      seterror "Validation Error.  That Company name is already in use."
      txtCompany.SetFocus
      return false
   end

   return true
End Function


Then finally in our Save method we load data from the controls into the object for saving:



Private Sub Save()
   moCompany.CompanyStatus pmStatus.RowTag(pmStatus.ListIndex)
   moCompany.dtClientSince = ccDatePicker1.dtmSelected
   moCompany.sCompanyNameCurrent = txtCompany.text
   moCompany.iCompanyEmployeeCount = txtNumberOfEmployees.text.val
   moCompany.SICCode ccSic1.SICcode
   moCompany.sURL = txtWebSite.text
   moCompany.sTaxIDNumber = txtTaxID.text
   moCompany.bInactive = chkInactive.Value
   moCompany.save
End Sub


Note that our save method doesn’t care if it’s a new or existing record.  Behind the scenes ActiveRecord does the appropriate Insert or Update prepared statements.

Every place where we are editing data we have these three Load, Save, and Validate methods.  Everyone on our team knows to look for them, so it’s very easy to work on projects collaboratively and know pretty much what’s going on.

Could ActiveRecord do data binding?  Sure.  The classes are open source so feel free to modify them to your heart’s content, but I truly believe it’s more a matter of the controls being the real pain.

ActiveRecord has a number of events that are handy to use.  We track who created and who changed each record using 4 fields on each table: CreatedDate, ModifiedDate, CreatedByID, and ModifiedByID.  We implement the BeforeCreate and BeforeUpdate events.  For example, the BeforeCreate event looks like this:



Sub BeforeCreate()
   dtCreatedDate = new date

   if session.oUser <> nil then
      iCreatedBy = Session.ouser.iUser_ID
   end
End Sub


This gets called before we save anything so the class properties get modified before we attempt to save.  In many projects we have an audit trail to know who changed what data so we add the AfterCreate and AfterSave events of Data.T_Company and pass the entire object into the Audit table:



Sub AfterCreate()
   dim oAudit as Data.T_Audit = Data.T_Audit.AuditAdd(self)
   oAudit.iCompany_ID = self.id
   oAudit.Save
End Sub


Then it’s up to the Audit class to query the ActiveRecord class to find changed data and put that into its table.  Again, the code to do this is in one spot rather than all over the project.

I could spend hours talking about ActiveRecord as we tend to use it on all of our new database projects.  It speeds up development of database applications.  It eliminates many of the common errors.  It tends to force most database code into the NameSpace classes.  And the compiler can warn you if you’re doing bad things with data.

ActiveRecord is not for EVERY project but we’ve found it incredibly useful in our consulting.  If you dread doing a database project because of the tediousness of database coding then perhaps ActiveRecord is for you.

We recently did a webinar with Xojo on ActiveRecord.  You can view it at http://developer.xojo.com/webinar-simplying-db-access-with-bkeeney-activerecord.  ActiveRecord itself is open source.  ARGen is $19.95.  We also use ActiveRecord in one of the start-to-finish training projects, called Link Share, at our training site.

ActiveRecord home page 

ARGen home page 

Xojo Training Site

Database Field Madness!

Two clients in the last several weeks have shown up with the exact same issue and it’s time to talk about it.  In both cases the clients had a field in a table that could contain multiple sets of data.  This is a really bad idea and if you find yourself doing it…well…stop it!

In the first case their company table has an email field and at some point in their past they decided that some users needed two email addresses.  Instead of creating an additional field for an alternate email address they simply decided to concatenate the data into one string separated by a comma.

To be honest I’m not sure why it sounded better to concatenate the data rather than create a second field but that’s what they did.  This decision was made years ago, of course, by a developer no longer working for the company, but it’s now up to me to ‘fix’ it.

With emails being so prolific it might make more sense to have an email-specific table that ties to the customer and can be inactivated.  This solves a number of issues.  It lets users have more than one email address.  It also keeps a history of the customer’s email addresses, so if you are trying to verify a user account via the phone it might be a way to verify their identity when all else fails (not that I’d rely on that alone).
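
A sketch of what that separate table might look like in SQLite (names made up):

// Illustrative schema: one row per email address, tied back to the customer.
db.SQLExecute("CREATE TABLE customer_email (" + _
   "customer_email_id INTEGER PRIMARY KEY AUTOINCREMENT, " + _
   "customer_id INTEGER NOT NULL REFERENCES customer(customer_id), " + _
   "email TEXT NOT NULL, " + _
   "is_active INTEGER NOT NULL DEFAULT 1)")  // 1 = active, 0 = retired address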

In the second case the customer table has several fields that can grow over time.  One field has notes separated by carriage returns, another field has billing data and another has payment data and each has ‘records’ in that single field separated by carriage returns and the data in each record separated by pipes.

Example:  The notes field for a single customer might be something like this:

6/20/2010 This is a note

6/30/2010 This is another note

7/5/2014 This is another note that could be pages long.

If you find yourself designing your database like this STOP right now and step away from the keyboard!  Databases are really good at having tables with related data.  Your notes, bills, and payments tables would all have a foreign key reference back to the customer table.  That way you can have as many of those child tables as needed without affecting the customer table.  Having fields that grow without bound in a single record is a bad thing.
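
Once the notes live in their own child table (say a hypothetical customer_note table with its own auto-incrementing primary key and a customer_id foreign key), adding a note becomes an ordinary insert, roughly:

// Hypothetical customer_note child table: one row per note.
Dim ps As SQLitePreparedStatement = db.Prepare("Insert Into customer_note (customer_id, note_date, note) Values (?, ?, ?)")
ps.BindType(0, SQLitePreparedStatement.SQLITE_INT64)
ps.BindType(1, SQLitePreparedStatement.SQLITE_TEXT)
ps.BindType(2, SQLitePreparedStatement.SQLITE_TEXT)
ps.Bind(0, iCustomerID)
ps.Bind(1, (New Date).SQLDate)
ps.Bind(2, "This is a note")
ps.SQLExecute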

Another issue that I see a lot: you, the programmer, should never, ever generate your own record IDs.  Let the database do that unless you have some really special-use scenario where you can do a better job of it.  To be clear, I’ve never seen this scenario.  Instead, your primary key should be an auto-incrementing integer that you should never be able to modify.  That’s not to say that you can’t create your own ‘human readable’ code, but it should never be used as the primary key of your table.

Another thing: take the time to use the native data types of the database.  If it’s an integer, use an integer field type.  If it’s a date, use a Date or DateTime field type.  Booleans, if not a native data type for the database you’re using, can be a TinyInt with a length of 1.  Converting to and from Xojo will not be an issue, and you let the database do a tremendous amount of work for you.  One client had an Amount field set up as a string.  To get a total amount they had to load the entire RecordSet, loop through it, convert each string to a double, and keep a running total.  Instead, they could have done a simple Sum in an SQL statement and let the database do all the work!  Trust me, it’s much faster that way.
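
To spell out the Amount example: with a numeric column the whole running-total exercise collapses into one query, something like this (table name made up):

// Let the database do the math instead of looping over a RecordSet.
Dim rs As RecordSet = db.SQLSelect("Select Sum(amount) As total From invoice Where customer_id = " + Str(iCustomerID))
If db.Error Then
   MsgBox "DB Error: " + db.ErrorMessage
   Return
End
Dim dTotal As Double = rs.Field("total").DoubleValue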

Those are my database hot button topics.  My DBA wife (i.e. The Database Goddess) has her own hot button topics and has beaten them out of me, er, convinced me of the error of my ways, and I no longer do them (or at least not without a reprimand).

In most cases the clients didn’t know any better and I’m sure at some point in my distant software development past I did some silly things like that too.  Those silly bad habits were beaten out of me after several accounting projects where database speed was essential.

What sorts of database blunders have you seen that now drive you nuts?