DB2 SQL Error: SQLCODE=-551, SQLSTATE=42501 and group permissions

Revised 13 Mar 2014

The Situation

A Groovy script that executes some SQL against a DB2 for LUW 9.7 database throws an exception:
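The original stack trace was not preserved, but the message from the IBM JDBC driver typically looks like this (the identifiers below are reconstructed from the discussion that follows):

```text
DB2 SQL Error: SQLCODE=-551, SQLSTATE=42501, SQLERRMC=USERID;EXECUTE;SCHEMA.GET_FACE_AMOUNT
```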


IBM states that SQLCODE -551 means:

authorization-ID does not have the required authorization or privilege to perform operation operation on object object-name.

Initially, this seems to be a pretty straightforward problem. This is a database that was recently copied from another system. It may well be that when it was restored, permissions were not granted to USERID to execute SCHEMA.GET_FACE_AMOUNT.

However, once the database administrator demonstrates that he can execute the GET_FACE_AMOUNT function while logged on as USERID, we’re left with a puzzle.

What We Learned

Our DB2 permissions were applied to user groups. The executing userid is a member of the ‘rw’ group, which should have read, write, and execute permissions on the GET_FACE_AMOUNT function. As it turns out, however, DB2 does not apply group permissions to user-defined functions.
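An immediate fix is to grant EXECUTE directly to the user rather than relying on the group. A sketch, using the schema, function, and user names from the example above:

```sql
-- Group privileges are not considered here, so grant directly to the user
GRANT EXECUTE ON FUNCTION SCHEMA.GET_FACE_AMOUNT TO USER USERID;
```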

It’s a much better practice to manage database permissions using roles. I’m not a DB2 DBA, so I can’t comment on exactly what the differences between groups and roles are. However, it seems like roles are a more robust way of defining permissions for a collection of users.
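A minimal role-based setup might look like this (the role name APP_RW is hypothetical):

```sql
-- Create a role, grant it the privilege, then grant the role to users
CREATE ROLE APP_RW;
GRANT EXECUTE ON FUNCTION SCHEMA.GET_FACE_AMOUNT TO ROLE APP_RW;
GRANT ROLE APP_RW TO USER USERID;
```

Unlike group privileges, privileges acquired through a role are considered by DB2 in cases like this one, which is why a role-based grant can work where the group grant did not.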


#db2, #db2-sql-error, #sql, #sql0551n, #sqlcode-551, #sqlstate-42501

Bulk Insert With Grails

Importing data from a text or CSV file into a database with Grails seems like it should be a simple thing. As it turns out, however, some approaches perform dramatically better than others.

Simple GORM

Let’s assume that we’ve created a Grails domain class called MyDomain to represent the table we’re importing into, and a parseLine() method to parse one line of file data (CSV or whatever) into values. You might try something like this:

void loadTable(File file) {
    file.eachLine() { line ->
        MyDomain dom = new MyDomain(parseLine(line))
        dom.save()
    }
}

Unfortunately, you would find that this code runs very slowly, and you might even run out of memory. This is because Grails caches at two different levels, causing locking and a potential memory leak. Read more about it:

So it’s better to try a different approach. There are many possibilities.

Call an External Tool

If your database platform offers a command-line bulk import tool, you could simply shell out and execute it, bypassing Grails entirely.

String command = """
db2 LOAD CLIENT FROM '/home/me/data.txt' OF ASC 
(1 9, 10 10, 11 19, 20 20, 21 26, 27 35, 36 71, 72 107, 108 127, 128 129, 
130 134, 135 138, 139 141, 142 144, 145 148, 149 149, 150 150, 151 155)
NULL INDICATORS (0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)

But you may not have such a command line tool, or it may not be available to you in all environments. And it’s probably not a good idea to give your web server the authority to run database commands from the shell.

Groovy SQL

Another approach might be to bypass Grails’ domain classes and the underlying Hibernate persistence layer and instead use Groovy Sql.

import groovy.sql.Sql

void loadTable(File file) {
    def db = grailsApplication.config.dataSource
    def sql = Sql.newInstance(db.url, db.username, db.password, db.driverClassName)
    file.eachLine() { line ->
        sql.execute("insert into my_table (field0, field1, field2, ...) values (?, ?, ?, ...)".toString(), parseLine(line))
    }
    sql.close()
}

Or for better performance, in Groovy 1.8.1+ (Grails 2.0.1+), you can batch the inserts:

void loadTable(File file) {
    def db = grailsApplication.config.dataSource
    def sql = Sql.newInstance(db.url, db.username, db.password, db.driverClassName)
    sql.withBatch(20, "insert into my_table (field0, field1, field2, ...) values (:val0, :val1, :val2, ...)".toString()) { preparedStatement ->
        file.eachLine() { line ->
            def fields = parseLine(line)
            preparedStatement.addBatch(val0: fields[0], val1: fields[1], val2: fields[2], ...)
        }
    }
    sql.close()
}

Hibernate StatelessSession

If you prefer to stick with GORM, you can. You’ll just need to compensate for Grails’ and Hibernate’s caching mechanisms, by pausing to clear the caches after every 20 records or so, as described in Burt Beckwith’s article on validation caching and Hibernate’s documentation on batch processing. Or instead of having to worry about clearing the Hibernate cache, you can open a stateless database session, which does not cache, for the duration of your import.
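Before looking at the stateless variant, here is a sketch of the flush-and-clear approach. It assumes a sessionFactory bean is injected into the service; the validation-cache cleanup line follows Burt Beckwith's article:

```groovy
void loadTable(File file) {
    int count = 0
    file.eachLine() { line ->
        new MyDomain(parseLine(line)).save()
        if (++count % 20 == 0) {
            // flush pending inserts and clear Hibernate's first-level cache
            sessionFactory.currentSession.flush()
            sessionFactory.currentSession.clear()
            // also clear Grails' validation cache, per Burt Beckwith's article
            DomainClassGrailsPlugin.PROPERTY_INSTANCE_MAP.get().clear()
        }
    }
}
```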

void loadTable(File file) {
    StatelessSession session = sessionFactory.openStatelessSession()
    Transaction tx = session.beginTransaction()
    file.eachLine() { line ->
        MyDomain dom = new MyDomain(parseLine(line))
        session.insert(dom)
    }
    tx.commit()
    session.close()
}

I have not benchmarked these approaches to see which performs best. You will probably have your own reasons to prefer one over the others.

#grails, #groovy, #hibernate, #insert, #sql