Pages

Saturday, December 19, 2015

Scribble 0.3.0

I am proud to announce a new version of the Scribble testing library! The biggest changes are the new modularization and documentation. There is now a separate module for every functional aspect, so you no longer have to pull a whole load of unused dependencies into your project if you only require a single functional aspect. In addition, the entire project documentation is now kept in the source and can be generated using Maven's site support. This includes this wiki documentation as well, although the publishing process is not yet part of the release build jobs.
As new testing features I introduced an HTTP server as a TestRule that can be set up in various ways to serve static content. It's still rather limited, but will be continuously improved in future releases. Further features are the possibility to create temporary zip files, a TestRule to record System.out and System.err, a rule to capture and restore system properties - a simple rule that helps keep the test environment clean - and finally a matcher for matching date strings against a date format.

For more information, have a look at the wiki or find the source code on GitHub.

Task

Story

  • [SCRIB-35] - Embedd static HTTP content as a rule
  • [SCRIB-43] - Build documentation as part of the release
  • [SCRIB-49] - Create zipped temp file from resources
  • [SCRIB-50] - Date Format Matcher
  • [SCRIB-52] - Rule for capturing System.out and System.err
  • [SCRIB-53] - Rule for setting and restoring System Properties

Bug

  • [SCRIB-39] - ConfigPropertyInjection#isMatching sets default value
  • [SCRIB-51] - TemporaryFile not usable as ClassRule
  • [SCRIB-57] - ApacheDS all prevents exclusion of modules
  • [SCRIB-58] - Remove SLF4J Binding dependencies
  • [SCRIB-59] - DirectoryServer/DirectoryService not working as ClassRule

Wednesday, July 8, 2015

Scribble Release 0.2.0

I am proud to announce a new version of the Scribble testing library! The new version has support for an embedded LDAP server, which allows you to write tests against an LDAP server without having to rely on existing infrastructure. Further, the JCR support has been improved: it is now possible to pre-initialize a JCR repository with content from a descriptor file and to create a security-enabled in-memory repository. Some additional improvements have been made in the CDI injection support, and the matchers have been extended with availability checks for URLs.

For more information, have a look at the wiki or find the source code on GitHub.

Release Notes - Scribble - Version 0.2.0

Bug

  • [SCRIB-31] - Primitive types not support for ConfigProperty injection
  • [SCRIB-32] - String to Number conversion of default values in ConfigProperty injection fails
  • [SCRIB-41] - LDAP Rules are not properly applied
  • [SCRIB-42] - ResourceAvailabilityMatcher is not compatible with URL
  • [SCRIB-48] - Directory Rules can not be used as ClassRules

Story

  • [SCRIB-1] - Builder support for LDAP Server and Service
  • [SCRIB-2] - Make LDAP Port configurable
  • [SCRIB-5] - Matchers for availability of an URL
  • [SCRIB-10] - Support for prepared JCR Content
  • [SCRIB-12] - Support security enabled content repositories
  • [SCRIB-14] - Add Convenience method for admin login
  • [SCRIB-33] - Convenience Methods for Directory creation
  • [SCRIB-34] - Convenience Method for anonymous login
  • [SCRIB-38] - Supply package-info.java

Friday, May 29, 2015

Multi-Module Integration Test Coverage with Jacoco and Sonar

Yesterday I struggled to capture IT coverage results in a multi-module project setup, a problem I eventually solved.

So let's assume I have the following setup:

rootModule
+Module1
+Module2
| +SubModule2-1
|    +SubModule2-1-1
| +SubModule2-2
+ITModule
  +ITModule1

The ITModule contains only integration tests, where ITModule1 is a special scenario that requires a single module. Module2 consists of nested submodules. There are several examples out there that use a path like ../target/jacoco-it.exec, but that obviously does not work if you have more than one nesting level.

To solve it, you must understand how Sonar does the analysis. When analyzing the coverage information, Sonar checks the code of each module against the coverage file that is specified in the sonar.jacoco.itReportPath property, which defaults to target/jacoco-it.exec. So when analyzing Module1, it checks for coverage info in Module1/target/jacoco-it.exec. But as the coverage data is captured in the ITModule, respectively ITModule1, I have to point Sonar to the file generated in the IT module.
So the best location to gather the coverage data is the rootModule, i.e. rootModule/target/jacoco-it.exec, and to append the results of all IT tests to that file.

I use the following plugin configuration, which uses separate files for unit-test coverage (don't forget the append flag, otherwise the overall coverage will be incorrect) and the central file for IT coverage.
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>0.7.4.201502262128</version>
  <executions>
    <execution>
      <id>prepare-agent</id>
      <goals>
        <goal>prepare-agent</goal>
      </goals>
      <configuration>
        <destFile>target/jacoco.exec</destFile>
        <append>true</append>
        <propertyName>surefireArgLine</propertyName>
      </configuration>
    </execution>
    <execution>
      <id>prepare-it-agent</id>
      <phase>pre-integration-test</phase>
      <goals>
        <goal>prepare-agent</goal>
      </goals>
      <configuration>
        <destFile>${session.executionRootDirectory}/target/jacoco-it.exec</destFile>
        <append>true</append>
        <propertyName>failsafeArgLine</propertyName>
      </configuration>
    </execution>
  </executions>
</plugin>
The ${session.executionRootDirectory} property is the root of the execution; when I build the entire project, it points to the rootModule. So this is the best path to use when you have a multi-module project with more than one level of nesting.
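Note that the surefireArgLine and failsafeArgLine properties only take effect if the respective test plugin actually references them in its argLine. A sketch for failsafe (surefire is configured analogously with ${surefireArgLine}):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <configuration>
    <!-- picks up the JaCoCo agent prepared by the prepare-it-agent execution -->
    <argLine>${failsafeArgLine}</argLine>
  </configuration>
</plugin>
```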

For the analysis, I need to point Sonar to that file when analyzing IT coverage, so I have to set sonar.jacoco.itReportPath accordingly. Unfortunately, this does not work with the session.executionRootDirectory property, and I have to set the absolute path to the file manually. I do not recommend specifying the absolute path in the pom.xml, as this path is specific to the build environment. So either set the path in Sonar or as a system property of your build environment. I set it directly in the Sonar project settings (Java > JaCoCo), for example /opt/buildroot/myProject/target/jacoco-it.exec. Now Sonar will check that file for the IT coverage analysis of each module.
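If you prefer setting it as a system property of the build instead of in the Sonar UI, the invocation could look like this (the path is just an example and has to match your build root):

```shell
mvn sonar:sonar -Dsonar.jacoco.itReportPath=/opt/buildroot/myProject/target/jacoco-it.exec
```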



Wednesday, May 27, 2015

Scribble 0.1.3

While working on the next release of Inkstand, I had to fix some bugs in the Scribble test framework's injection support, which just got released.

Release Notes - Scribble - Version 0.1.3

Bug

  • [SCRIB-26] - Check for null injection target
  • [SCRIB-27] - TcpPort has no proper toString() representation
  • [SCRIB-30] - Field candidates are not collected for null-values

Story

  • [SCRIB-29] - Injection does not fail if no target is found

Tuesday, May 26, 2015

Java XML Processing Vulnerabilities

Last week I was fixing issues for my pet project Scribble. I use Sonar for capturing issues in my code. Since April this year, the FindBugs plugin for Sonar includes rules for finding security bugs. Two of the bugs found were related to XML processing using Java's XML APIs for XPath and DOM parsing. The security issues themselves were not new; both of them were discovered some years ago. But to me they were new, as I was not aware of them at all. For my pet project they are not that critical, as it is just a framework for writing tests, and no one using that framework is kept from writing vulnerable code themselves. But for me it was a good case for studying the issues so I can avoid them when it really matters.

 

XPath Injection

XPath injection follows the same principle as SQL injection: parameter values used in an XPath expression contain characters that are semantically bound to the XPath syntax and thereby break out of the path defined by the expression.

The Attack

Suppose you have an XML document containing sensitive data:

<technical-users>
  <user id="reader">
    <privateKey>ABC</privateKey>
  </user>
  <user id="writer">
    <privateKey>123</privateKey>
  </user>
</technical-users>
and an XPath expression with a parameter that is filled in at runtime:
"//technical-users/user[@id='" + userId + "']/privateKey"
Let's assume the attacker has authenticated successfully as reader and now tries to query for the private key, manipulating their own user ID to this value:

reader']/../user[@id='writer

The injected value leaves the reader subpath, traverses one level up and back down into the writer subpath, and thereby delivers the privateKey of that user. A variation of this attack applies if the authentication data of a webapp is stored in XML, e.g. in an XML database: with a forged userId the system can be tricked into authenticating without a proper password.
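To illustrate in Java how the concatenation produces the malicious query (a minimal sketch; the class and method names are made up for this example):

```java
public class XPathInjectionDemo {

    // Vulnerable: user input is concatenated directly into the expression
    public static String buildQuery(String userId) {
        return "//technical-users/user[@id='" + userId + "']/privateKey";
    }
}
```

Called with the forged value reader']/../user[@id='writer, buildQuery returns //technical-users/user[@id='reader']/../user[@id='writer']/privateKey, which selects the other user's key.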

The Defense

The only effective defense is to sanitize the user input! Typically, a regex pattern can help by allowing only input of a certain shape, e.g. only alphanumeric characters within a specific length range (5 to 15 characters):
if (!userId.matches("[a-zA-Z0-9]{5,15}")) {
  throw new Exception("Invalid Input");
}
If reserved characters should be allowed, you may escape them:
String escapedUserId = userId.replaceAll("'", "\\\\'");
(Note that replaceAll processes backslash escapes in the replacement string as well, hence the four backslashes to produce a single literal one.) This may still be prone to further injection attempts that circumvent the escaping, so it should be thoroughly tested if self-implemented. Both pattern matching and escaping can be encapsulated in a javax.xml.xpath.XPathVariableResolver that is registered on the XPath instance. The following example shows a sanitizing variable resolver that accepts a set of regular expressions to check the parameters that should be resolved:
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.regex.Pattern;
import javax.xml.namespace.QName;
import javax.xml.xpath.XPathVariableResolver;

public class SanitizingVariableResolver implements XPathVariableResolver {
  //map containing the variable values
  private Map<QName, String> variables = new HashMap<>();
  //list of all valid patterns
  private final List<Pattern> validationPatterns;

  //constructor accepting regular expression patterns
  public SanitizingVariableResolver(String... regexPatterns) {
    this.validationPatterns = new ArrayList<>();
    for (String regexPattern : regexPatterns) {
      this.validationPatterns.add(Pattern.compile(regexPattern));
    }
  }

  //adds a variable value on which the sanity check is applied
  public void addVariable(String name, String value) {
    for (Pattern pattern : validationPatterns) {
      if (pattern.matcher(value).matches()) {
        variables.put(new QName(name), value);
        return;
      }
    }
    //don't accept invalid values
    throw new IllegalArgumentException("The value '" + value + "' is not allowed for a variable");
  }

  @Override
  public Object resolveVariable(QName variableName) {
    return this.variables.get(variableName);
  }
}
Next, you'll have to apply this resolver to your XPath instance and use an XPath expression with a variable placeholder:
//create new xpath instance
final XPath xp = XPathFactory.newInstance().newXPath();

//instantiate the resolver with an alphanumeric pattern
final SanitizingVariableResolver resolver = 
  new SanitizingVariableResolver("[a-zA-Z0-9]{4,15}");

//add the user id value
resolver.addVariable("userId", userId);

//assign the resolver to the xpath instance
xp.setXPathVariableResolver(resolver);

//apply the xpath expression with the variable
xp.evaluate("//technical-users/user[@id=$userId]/privateKey", source);

As an alternative to sanitizing the input yourself, you may use other query technologies such as XQuery, which provide an abstraction layer on top of XPath with means to bind parameters safely.


XML External Entity (XXE)

XML documents have to be well-formed and may additionally be validated. For validation, there are two options for declaring a structure against which the document is validated: a Document Type Definition (DTD) or an XML Schema. A DTD may be embedded in the document itself. XML has the concept of entities to describe characters or values that are parsed and replaced by the XML processor. A common example is the &amp; entity for describing an ampersand character ('&'), because '&' is a reserved character in XML. Within a DTD, custom entities can be declared. Values for those entities can be characters, but also the content of external resources indicated by a URI.

The Attack

In an XXE attack, the attacker sends a prepared XML file containing a malicious entity. The entity points to an external resource containing a secret, e.g. /etc/passwd. Depending on what the service actually does, the attacker may easily read the secret from the parsed document.
A prepared XML document may look like this:

<?xml version="1.0"?>
<!DOCTYPE document [
    <!-- placeholder for the attacked file url -->
    <!ENTITY xxe SYSTEM "/etc/passwd" >
]>
<document>
    <!-- the external entity is replaced with the injected value -->
    <property>&xxe;</property>
</document>
When the document is processed by a DocumentBuilder, &xxe; is resolved to the content of /etc/passwd and becomes accessible as text content of the element. The attack also applies when processing XML with a SAX parser.
 

The Defense

There are several options to fix this vulnerability. Probably the easiest one is to use XML Schemas only for validation and to disable doctype declarations by setting the DocumentBuilderFactory feature http://apache.org/xml/features/disallow-doctype-decl to true:
DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
dbf.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true);


This feature, however, is only supported by Xerces 2. If you're on Xerces 1 or you cannot disable doctype declarations, you can disable the following features instead:

  • Xerces 1: http://xerces.apache.org/xerces-j/features.html#external-general-entities and http://xerces.apache.org/xerces-j/features.html#external-parameter-entities
  • Xerces 2: http://xerces.apache.org/xerces2-j/features.html#external-general-entities and http://xerces.apache.org/xerces2-j/features.html#external-parameter-entities
  • SAX in general: http://xml.org/sax/features/external-general-entities and http://xml.org/sax/features/external-parameter-entities

and additionally set the following flags on the DocumentBuilderFactory:
dbf.setXIncludeAware(false);
dbf.setExpandEntityReferences(false);
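Putting the pieces together, a hardened factory could look like the following sketch; which features are actually supported depends on the underlying parser, and the same feature URIs can also be set on a SAXParserFactory via its setFeature method:

```java
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;

public class SecureDocumentBuilder {

    public static DocumentBuilder newSecureBuilder() throws Exception {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        // preferred: reject any document with a DOCTYPE declaration (Xerces 2)
        dbf.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true);
        // fallback flags in case DOCTYPEs must remain allowed
        dbf.setFeature("http://xml.org/sax/features/external-general-entities", false);
        dbf.setFeature("http://xml.org/sax/features/external-parameter-entities", false);
        dbf.setXIncludeAware(false);
        dbf.setExpandEntityReferences(false);
        return dbf.newDocumentBuilder();
    }
}
```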
 
Oracle proposes two alternative approaches. The first is to perform the parse operation in a privileged context with a no-permission ProtectionDomain in which the Java security policy is effective, preventing access to restricted system files. The second is to use an EntityResolver and allow only entities that match a certain pattern.
Further attacks against DTDs, schemas and entities, and how to defend against them, are discussed in "XML Schema, DTD, and Entity Attacks" (PDF).
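The EntityResolver approach could be sketched as follows; the whitelist URL prefix is a hypothetical example, not part of Oracle's proposal:

```java
import java.io.StringReader;
import org.xml.sax.EntityResolver;
import org.xml.sax.InputSource;

public class WhitelistEntityResolver implements EntityResolver {

    @Override
    public InputSource resolveEntity(String publicId, String systemId) {
        // only entities from a trusted location fall through to default resolution
        if (systemId != null && systemId.startsWith("https://example.org/dtd/")) {
            return null; // null means: resolve normally
        }
        // everything else is replaced by an empty entity instead of being fetched
        return new InputSource(new StringReader(""));
    }
}
```

Register it via documentBuilder.setEntityResolver(new WhitelistEntityResolver()) before parsing.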


All examples, including JUnit tests that can be used as templates to test your own code, can be found at https://github.com/gmuecke/whoopdicity/tree/master/examples.

Inkstand Release 0.1.3

Yesterday I released version 0.1.3 of the Inkstand microservice framework, fixing some bugs and adding some minor improvements.
 

Bug

    [INK-31] - Sonar Issue - Security - XML Parsing Vulnerable to XXE (SAXParser) in JCRContentLoader
    [INK-17] - Apply Apache Licence to code base
    [INK-29] - Findings in Code Inspection
    [INK-32] - remove log4j2.xml from core
    [INK-41] - Wrong Logging Statements in ServiceLauncher

Task

    [INK-4] - Document Inkstand in wiki
    [INK-5] - Set up Test Quality Assessment
    [INK-6] - Set up Build for master branch
    [INK-34] - Update to Apache Jackrabbit 2.10.1
    [INK-35] - Update Apache DS to 2.0.0-M20 and LDAP API to 1.0.0-M30
    [INK-36] - Update Undertow to 1.2.6.Final
    [INK-37] - Update Apache Deltaspike to 1.4.0
    [INK-40] - Update Scribble to 0.1.2

Tuesday, May 19, 2015

Scribble 0.1.2

Today I released version 0.1.2 of the Scribble testing framework. Besides some bugfixes, the main improvement is the added support for initializing the JCR ContentRepository test rules with nodetype definitions from a CND file.

Bug

    [SCRIB-16] - @Inject Annotations is not considered
    [SCRIB-19] - InjectableHolders are not recognized properly when injecting
    [SCRIB-23] - Sonar Issue: The use of XPath.evaluate() is vulnerable to XPath injection
    [SCRIB-24] - Sonar Issue: The usage of /DocumentBuilder.parse(...) is vulnerable to XML External Entity attacks
    [SCRIB-25] - Sonar Issue: Use a cryptographically strong random number generator (RNG) like "java.security.SecureRandom" in place of this PRNG

Story

    [SCRIB-11] - Convenience Methods for InMemory and StandaloneRepository creation
    [SCRIB-20] - Initialize Repository with CND node types

Task

    [SCRIB-17] - Set up Build for master branch
    [SCRIB-18] - Set up Test Quality Assesment
    [SCRIB-21] - Update Apache DS Dependency to 2.0.0-M20
    [SCRIB-22] - Fix "Copyright and license headers should be defined" Rule configuration

Tuesday, May 12, 2015

Off to new horizons

April 30th was a weird day. It started like a normal day and ended with me being fired out of the blue, 20 minutes before I had to leave to pick up my kids from childcare. The reason I was given was that I don't fit into the company's culture, which is kind of weird after 2.5 years of hard work making customers happy and a proven track record of outstanding results.

What happened?

Well, I don't know, I can only speculate. With the last project I was involved in, the company tried to enter the market of software vendors. The company itself has a long history of success in consulting, but so far had no experience in producing software itself. We adopted Scrum as our method, and I had the role of Solution Architect or Architecture Owner (you name it) and Scrum Master. We made good progress while having to tackle lots of obstacles, especially as the stack we used was new to us, same as the persistence layer.

My personal goals as Scrum Master were, first, to pave the way for bringing the product to its first go-live; second, to aim for high quality, as a small company cannot afford to produce crappy software; and third, to increase the transparency of the project's progress, as a lot of the stakeholders were working at customer sites and were not located in the office. I also stood up for protecting the team from unnecessary overtime, as its effect on declining quality is well known. And I tried to mitigate unrealistic expectations, from wishful thinking to what's realizable, always embracing challenges. I really took my job seriously and diligently. But I assume it was too much for a company that had not fully embraced the agile idea. I guess in the end it was a personal conflict that grew on one end, totally hidden from me, and no one ever made an attempt to solve it in a professional way. Stories about Scrum Masters being fired for taking their job seriously are not unknown, and now it was my turn. Bad luck, I'd say.

My advice for other Scrum Masters: if your stakeholders are not physically available most of the time, find ways for effective communication rather sooner than later. And if your stakeholders are detached, try to re-connect them with the project and what's actually going on. Though that might only reduce, not remove, the risk.

Anyway, now it's official: I'm looking for a new job!
So if you have or hear of an open position as Scrum Master, Software Engineer or Architect, drop me a message.

And while I'm on the job search, I'll spend some time on my new pet project, a microservice framework named Inkstand! If you're interested, I invite you to come around, have a look, drop a comment or join!

Tuesday, February 3, 2015

Mutation Testing

End of January I attended the OOP2015 conference in Munich. Among the load of interesting sessions was one that left a mark: the workshop conducted by Filip van Laenen and Markus Schirp about Mutation Testing (slides here), which I'd like to summarize in this post.

What is Mutation Testing?

Mutation testing is a method to ensure the quality of your tests. With mutation testing you verify that your tests not only invoke your product code, pretending to cover lines and branches, but actually reflect the semantics of your code. While your tests guard your product code from bugs, mutation testing guards your test suite from critical gaps.

How does it work?

Mutation testing is available for a set of languages; the implementation for Java is PIT. PIT modifies the compiled byte code by applying a set of predefined rules, so-called Mutators. Each change to the byte code is called a Mutant. Your unit tests are then executed against the altered byte code, resulting in one of three outcomes:
  • The test fails. This proves that your test detects the change to the byte code correctly and reports an error. This outcome is called a Killed Mutant.
  • The test does not fail, but the line of code is covered by the test execution. This proves that your test does not cover the full semantics of the product code. This outcome is called a Survived Mutant (I call them Survivors).
  • The product code is not covered by a test at all but contains a mutant. This outcome is an Uncovered Mutant (I call them Lurkers), which can also be indicated by a code coverage tool such as Cobertura or JaCoCo.
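To make the three outcomes concrete, consider this made-up example: a mutator might change the > in the method below to >=. A test that only calls isPositive(1) covers the line but lets the mutant survive; adding the boundary check on isPositive(0) kills it, and if no test touches the method at all, the mutant stays uncovered.

```java
public class Threshold {

    // product code: zero is not considered positive
    public static boolean isPositive(int x) {
        return x > 0;
    }
}
```

PIT would report the mutated >= version as killed only if the suite contains an assertion that isPositive(0) is false.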

What's the use of it?

Every mutant not killed by a test is a blind spot of your test suite and may become an actual bug in the long run. With mutation testing you can determine whether your unit tests - the detail specification - cover all the semantics of your product code. For every surviving mutant you can decide whether your specification (test) is incomplete or your product code contains unneeded semantics that should be removed. That way, mutation testing leads to smaller and simpler code, and it is furthermore an indicator of when you are done with testing.

It's no silver bullet

Mutation testing is no silver bullet, as it does not replace the need for properly designed tests in the first place. Testing against all mutants can become a quite resource-consuming operation, which may require narrowing the scope of mutation testing in larger projects to the crucial parts.
Further, some survived mutants may simply have to be accepted. Survived mutants occur easily and are not necessarily an indicator that your coverage is insufficient. For a simple example, think about logging code, or equality in some comparison cases, like (a > b ? a : b), which in Java is equivalent to (a >= b ? a : b).

How to use it?

To use it on a Maven project, an official Maven plugin is available (see the link for good documentation on how to use it). There are options for the command line or Ant as well. The plugin creates a report in HTML or XML. You may narrow the scope of the mutation testing using the include and exclude parameters. The report itself, run on a regular basis, is already a good source of information for conducting code reviews.
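A minimal plugin configuration could look like the following sketch (the version and the package filters are examples only; check the plugin documentation for current values):

```xml
<plugin>
  <groupId>org.pitest</groupId>
  <artifactId>pitest-maven</artifactId>
  <version>1.1.4</version>
  <configuration>
    <!-- restrict mutation testing to the crucial packages -->
    <targetClasses>
      <param>com.example.core.*</param>
    </targetClasses>
    <targetTests>
      <param>com.example.core.*</param>
    </targetTests>
  </configuration>
</plugin>
```

The report is then generated with mvn org.pitest:pitest-maven:mutationCoverage.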
For using it with SonarQube, a plugin is available, but its current version 0.5 is only compatible with SonarQube versions prior to 4.2. There is a fork on GitHub making it compatible with Sonar API version 4.3, and I am working on a version for SonarQube 5.0 with some more rules. Both have to be compiled and deployed to Sonar manually.

TL;DR

Mutation testing alters your product code in a deterministic way and verifies whether your test suite finds the induced bug. This leads to smaller and simpler source code, and you can tell when you're done with testing.