Grails cascade validation for POGOs

Recently, I have been working on some code that does not use any of the GORM capabilities in Grails; instead, controllers call a service layer which interacts with web services hosted on an ESB. The domain classes in use are not recognized by Grails as the domain artefact type and, as such, do not inherit certain traits. One of the missing traits is the ability to cascade validation from the object being validated down to any child objects. Let me illustrate.

Let’s say that we have a class Person which has some properties. One of these is a residential address, which is of type Address.

import grails.validation.Validateable

@Validateable
class Person {
    String firstName
    String lastName
    Address residenceAddress

    static constraints = {
        firstName  blank: false //other constraints
        lastName   blank: false //other constraints
    }
}

@Validateable
class Address {
    String line1
    String line2
    String city
    String state
    String postalCode

    static constraints = {
        line1       blank: false //other constraints
        city        blank: false //other constraints
        state       blank: false //other constraints
        postalCode  blank: false //other constraints
    }
}

Since the Person class is a POGO and not a Grails domain artefact, the first thing we need to do to make it validateable during Grails controller binding is to mark the class with @Validateable. Then we can add the static constraints block, which will be used to validate the properties during binding.

Issues arise when we need to validate the residenceAddress property of Person. How can we accomplish this? One way to do this is to manually validate the properties, and report any errors.

class MyController {
    // ...
    def personSubmit(Person p) {
        // ...
        // p is already validated, except for residenceAddress
        def anyErrors = p.hasErrors() ||
                        !p.residenceAddress.validate()
        if(anyErrors) {
            // report errors ...
        }
    }
}

The problem with the above approach is two-fold. First, it doesn’t scale. We shouldn’t have to validate the parent object and then separately validate each child property. Imagine if this class had multiple child properties that were themselves objects that in turn contained properties that were objects; validating all of that by hand would be a nightmare. Second, since you are manually invoking the validate method out of sequence with what is in the constraints block, any errors are reported within their own object and there is no way to know the original sequence of the errors. This is particularly annoying when you need to display errors from a form submission to a user and they are listed in an order inconsistent with the layout of the form.

So, I decided I would write some code to provide the functionality that is missing from POGOs.

import org.codehaus.groovy.grails.validation.*
import org.springframework.validation.*

class CascadeValidationConstraint extends AbstractVetoingConstraint {

    public static final String NAME = "cascadeValidation"

    @Override
    String getName() {
        NAME
    }

    @Override
    boolean supports(Class type) {
        true
    }

    @Override
    public void setParameter(Object constraintParameter) {
        if (!(constraintParameter instanceof Boolean)) {
            throw new IllegalArgumentException(
                """Parameter for constraint [$name] of
                   property [$constraintPropertyName]
                   of class [$constraintOwningClass]
                   must be a Boolean
                """
            )
        }
        super.setParameter(constraintParameter)
    }

    @Override
    protected boolean skipNullValues() {
        return true
    }

    @Override
    protected boolean processValidateWithVetoing(
            Object target, Object propertyValue,
            Errors errors) {
        if (!propertyValue.validate()) {
            propertyValue.errors.fieldErrors.each {
                String field = "${propertyName}.${it.field}"
                def fieldError = new FieldError(
                    target.errors.objectName,
                    field,
                    it.rejectedValue,
                    it.bindingFailure,
                    it.codes,
                    it.arguments,
                    it.defaultMessage
                )
                errors.addError(fieldError)
            }
            return false
        }
        return true
    }
}

The above constraint validates the given property and, if it doesn’t pass validation according to the constraints block defined on its class, adds each error as a FieldError to the parent object at the correct field location. Then you just need to register your custom constraint. You can do this in Config.groovy, BootStrap.groovy, or during plug-in initialization if you build this into a plug-in.

import org.codehaus.groovy.grails.validation.ConstrainedProperty

ConstrainedProperty.registerNewConstraint(
    CascadeValidationConstraint.NAME,
    CascadeValidationConstraint.class
)
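
For example, a minimal BootStrap.groovy registration might look like this (a sketch; the plug-in initialization variant would make the same call from the plug-in descriptor):

import org.codehaus.groovy.grails.validation.ConstrainedProperty

class BootStrap {

    def init = { servletContext ->
        // make cascadeValidation available to every constraints block
        ConstrainedProperty.registerNewConstraint(
            CascadeValidationConstraint.NAME,
            CascadeValidationConstraint
        )
    }

    def destroy = { }
}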

Now our Person class can be defined as follows and all validations are handled as expected (note the residenceAddress constraint below).

import grails.validation.Validateable

@Validateable
class Person {
    String firstName
    String lastName
    Address residenceAddress

    static constraints = {
        firstName        blank: false //other constraints
        lastName         blank: false //other constraints
        residenceAddress cascadeValidation: true
    }
}

@Validateable
class Address {
    String line1
    String line2
    String city
    String state
    String postalCode

    static constraints = {
        line1       blank: false //other constraints
        city        blank: false //other constraints
        state       blank: false //other constraints
        postalCode  blank: false //other constraints
    }
}
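
To see the cascade in action, here is a rough sketch (values are illustrative, and it assumes the constraint has been registered as shown earlier). Validating the Person now pulls the Address failures up onto the Person at the nested field path, so the errors come back in an order consistent with the constraints block:

def person = new Person(
    firstName: 'Ada',
    lastName: 'Lovelace',
    residenceAddress: new Address(line1: '', city: 'London', state: '', postalCode: 'SW1')
)

assert !person.validate()
// the Address errors are reported on the Person under the nested field names
assert person.errors.getFieldError('residenceAddress.line1')
assert person.errors.getFieldError('residenceAddress.state')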

The Agile Mindset

I was thinking back on an interaction I had with one of my co-workers last week, and how it made me realize that I have had a personal shift in how I identify myself at work.

This co-worker is relatively new to the company, having only been here for a few weeks, and I hadn’t had an opportunity to introduce myself. The other piece of background is that only part of our organization has operated under an Agile framework; the rest is still using a “traditional” system. We had the usual exchange of names, but when they asked who I worked for, my response is what gave me the epiphany. I didn’t say my manager’s name, I didn’t say “development” or even the project I am working on. I said “the Honey Badgers”, which is the name we have given to our Scrum team that started together back in 02/2012.

If asked that question before 02/2012, I can almost guarantee you I would have said “development” or “I’m a developer” or something to that effect. This is the change in mindset I am referring to. If you are like me, you will no longer find yourself identifying solely with your “silo” department; you will identify with the team you have been a part of and built a relationship with over the course of time. This is to take nothing away from my fellow developers or my respect for them. I actually think it is a good thing for the team and the organization when you start to identify yourself this way. It shows that you have invested yourself in your team and have dedicated yourself to the success of what your team produces.

What do you think? If you have transitioned to an Agile team from some other sort of management, have you had this same shift in your mindset?


Issue with Tomcat deployment after Grails upgrade 2.0.1 -> 2.2.1

I have been in the process of upgrading all of our existing Grails applications from version 2.0.1 to version 2.2.1. On one particular project I upgraded earlier today, everything worked fine until I attempted to deploy the application to our dev integration environment. At that point, Tomcat threw the following error when attempting to deploy:

java.lang.NoClassDefFoundError: org/apache/tomcat/PeriodicEventListener

After some research, I realized that the upgrade script had indeed left the Tomcat plug-in installed via application.properties instead of placing a reference in BuildConfig.groovy, which is the recommended approach. (It also re-installed Hibernate, which we don’t use on this particular application. The upgrade script ALWAYS installs Hibernate!) Here is the correct BuildConfig.groovy entry:

plugins {
   // other plug-ins
   build ":tomcat:$grailsVersion"
   // ...
}
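
For reference, the stale entries that the upgrade script leaves behind in application.properties look roughly like this (version numbers are illustrative); they should be removed once the BuildConfig.groovy reference is in place:

#remove these once BuildConfig.groovy declares the plug-ins
plugins.tomcat=2.2.1
plugins.hibernate=2.2.1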

Once the deployment had failed via the Tomcat Manager, the application was no longer listed but it still resided on the filesystem, and later deployments could not overwrite the files. Here are the steps I had to take to resolve the deployment issue:

  1. Deleted the WAR and context directories off of the filesystem
  2. Restarted Tomcat
  3. Verified the application was no longer in filesystem and no longer listed in Tomcat manager
  4. Checked the application into source control with the correct plug-in reference, which triggered CI build and deploy to dev. integration
  5. Smoke tested the application

Issue with inner classes in Grails 2.2.0

I was working on a controller in a Grails 2.2.0 project, and it was a Groovy class structured something like this:

package mypackage;

class MyController {
  // controller methods/closures/etc.
  // uses MyControllerCommand in a POST
}

class MyControllerCommand {
  // properties/methods/constraints/etc.
}

I have used this pattern previously in Grails 1.3.7 and Grails 2.0.1; however, when I invoked the closure which used the command object, I got the following error:

errors.GrailsExceptionResolver ERROR VerifyError occurred when processing request: [POST] /MyApp/myController/myMethod
(class: mypackage/MyController$MyControllerCommand, method: jsonHeader signature: (Ljava/lang/Object;)V) Incompatible object argument for function call. Stacktrace follows: ...

After some head scratching, and then remembering something my colleague said, I moved the command object out to its own Groovy file and the problem went away. After some more searching, I found a reported Grails issue related to using an anonymous inner class in a controller.

But how is this related to the code above? That’s not an inner class, right? Ah, but it is. Remember how I said this was a Groovy class that housed the controller? When there are multiple class definitions in a single Groovy file, groovyc creates the equivalent of Java inner classes. You can verify this by checking the target directory after compiling your Grails project. In this instance, you would find:
target/classes/mypackage/MyController$MyControllerCommand.class
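
The fix, then, is simply to give the command class its own file. Here is a minimal sketch of pulling it out (the src/groovy location and the @Validateable annotation are my assumptions about what the standalone class needs so that it can still be validated during binding):

// src/groovy/mypackage/MyControllerCommand.groovy
package mypackage

import grails.validation.Validateable

@Validateable
class MyControllerCommand {
    // same properties/methods/constraints as before
}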


JAX-WS and annotation overload

I have been utilizing JAX-WS to generate web service clients in my current Grails project. It is the best I have found so far in consuming SOAP-based web services within a Grails project. I do have one beef with the library though: annotation overload! I am not a huge fan of the overuse of annotations that has happened over the last few years. JAX-WS seems to want to take this to a new level. For the most part the annotations do seem to stay out of the way, except for one scenario which we have in many of our XSD schemas: <xsd:any>.

For example, you have defined a complex type where one of the elements is declared as <xsd:any>, so that different XML documents can be inserted at that element. The problem is that JAX-WS wants to know all of the possible types at compile time via annotations! If you don’t provide the run-time classes at compile time, you will end up with a message like this when you attempt to marshal the XML:


class example.package.MyType nor any of its super class is known to this context

This breaks the abstraction of defining the implementation at run-time. I have not found a simple way to get around this while still adding classes to the JAXBContext like I used to be able to do. We have been able to work with it for the most part, but if anyone knows of an easy way to add classes to the JAX-WS JAXBContext at run-time, feel free to let me know!
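
To make the complaint concrete, here is roughly what “knowing all of the possible types at compile-time” means in practice: the generated type that owns the <xsd:any> element (GeneratedRequest below is an illustrative stand-in) has to enumerate every possible payload class via JAXB annotations such as @XmlSeeAlso, rather than the classes being registered with the JAXBContext at run-time. A sketch with illustrative names:

import javax.xml.bind.annotation.XmlAnyElement
import javax.xml.bind.annotation.XmlSeeAlso

// stand-in for the run-time payload class from the error message above
class MyType { String value }

// stand-in for the generated complex type that contains the <xsd:any> element;
// every payload class that could ever appear there has to be listed at compile time,
// or marshalling fails with "... is not known to this context"
@XmlSeeAlso([MyType])
class GeneratedRequest {
    @XmlAnyElement(lax = true)
    Object payload   // the element generated from <xsd:any>
}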


Using GrailsApplication in UrlMappings.groovy

As of the version of Grails we are currently using (2.0.1), the grailsApplication bean is not available for use in the UrlMappings class. However, the UrlMappingsHolderFactoryBean implements GrailsApplicationAware, so we can call getGrailsApplication() to do things such as make mappings conditional on the configuration, as follows:

class UrlMappings {
	
	static excludes = [...]

	static mappings = {
		if(getGrailsApplication().config.doA) {
			"/**" (view: '/a')
		} else {
			"/**" (view: '/b')
		}
	}
}

Convert a 32-bit JIRA install to 64-bit

Along with being a software developer on our project, I have also become the unofficial tools administrator. We utilize many of the Atlassian tools, among them JIRA. When JIRA was installed, it was put on a Windows Server 2008 R2 server, but was installed with the JIRA Standalone 32-bit installer. Due to the Windows memory addressing space for 32-bit processes, we could only allocate ~1GB of heap space to the JVM. Any more and the JVM would fail to start. Any less and the process would throw a java.lang.OutOfMemoryError before it even started fully. Even at the magical 1GB heap size, we would only run stable for half a day to a day before we would blow the heap.

I contacted Atlassian support about getting our install running as a 64-bit process on a 64-bit JVM, and their response was “Create a new database schema, install the 64-bit JIRA standalone pointing at that new schema and then migrate your data from the old schema to the new schema.” No, not happening. I did some digging and figured out how to get the 32-bit install to run as a 64-bit process. Once I did this, the process started as a 64-bit Windows process and I was able to allocate more memory to the heap.

  1. Put a 64-bit JVM on the system, if it does not already have one.
  2. Change the Tomcat process properties to point to the 64-bit JVM.
  3. Rename tomcat.exe to tomcat.exe.x86
  4. Rename tomcat.exe.x64 to tomcat.exe
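
In command form, steps 3 and 4 amount to the following (the install path is illustrative; adjust to your JIRA directory):

cd "C:\Program Files\Atlassian\JIRA\bin"
ren tomcat.exe tomcat.exe.x86
ren tomcat.exe.x64 tomcat.exe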