Categories
Atbash Overview Security

Atbash Summer release train

Introduction

All the Atbash repositories are still under heavy development, which is why they are released in one go. Over the last few days, such a release of almost all libraries was performed.

This post gives a short overview of what you can find.

Big features

The big feature changes can be found in

  • Atbash JWT support related to cryptographic key support.
  • Atbash Rest Client, a Java 7 port of the MicroProfile Rest Client spec.
  • And Atbash Octopus, where KeyCloak, the MicroProfile JWT Auth spec, and interoperability between schemes are central in this release.

Cryptographic key support

Since there are many formats in which keys can be persisted (PEM, Java Key Stores, JWK, etc …), they are all internally stored as an AtbashKey. It contains the key itself (as a Java object), the identification, and the type of the key (like RSA, private or public part, etc …)

Creating such keys can be achieved by using the class KeyGenerator, with the method generateKeys(). This class is available as a CDI instance or can be instantiated directly in those environments/locations where no CDI is available.

The parameter of the generateKeys() method defines which key(s) are created. This parameter can be created using a builder pattern.

RSAGenerationParameters generationParameters = new RSAGenerationParameters.RSAGenerationParametersBuilder()
        .withKeyId("the-kid")
        .build();
List<AtbashKey> atbashKeys = generator.generateKeys(generationParameters);

In the above example, multiple keys are generated since RSA is an asymmetric key and thus private and public parts are generated.

Writing of a key can be performed with the KeyWriter class. It has a method, writeKeyResource, which can be used to persist a key into one of the formats. The format is specified as a parameter of type KeyResourceType. This can indicate the required format like PEM, Key store, JWK, etc…

The specific type of PEM (like PKCS1, PKCS8, etc …) is defined by the configuration parameters.

Another parameter defines the password/passphrase for the key (if needed) and another one for the file as a whole, in the case of the Java KeyStore format for example.
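Writing could look roughly like this (a minimal sketch; the exact parameter order and return type of writeKeyResource are assumptions based on the description above):

KeyWriter keyWriter = new KeyWriter();

AtbashKey atbashKey = atbashKeys.get(0);   // e.g. one of the keys generated earlier

// persist the key in PEM format; the passphrase protects the key itself,
// a file password would only be relevant for formats like the Java KeyStore
byte[] pemContent = keyWriter.writeKeyResource(
        atbashKey,
        KeyResourceType.PEM,
        "key-passphrase".toCharArray(),
        null);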

The last functionality around keys is the reading of all those keys in the supported formats. This functionality is implemented in the KeyReader class. It is again a CDI bean which can also be instantiated directly when no CDI environment is available.

It contains a readKeyResource() method which can read all the keys in a resource (like a PEM file, Java Key Store, JWK, etc …). As a parameter, an instance of KeyResourcePasswordLookup is supplied which retrieves a password in those cases where it is needed (to read the file or decrypt the key).

The return value of the method is a list because a resource can contain more than one key and thus multiple AtbashKeys.
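Reading a resource could then look roughly like this (again a sketch; the method names on KeyResourcePasswordLookup are assumptions based on the description above):

KeyReader keyReader = new KeyReader();

List<AtbashKey> keys = keyReader.readKeyResource(
        "keys/demo.jwk",                 // a PEM file, Java Key Store, JWK, ... resource
        new KeyResourcePasswordLookup() {
            @Override
            public char[] getResourcePassword(String path) {
                // password to open the resource as a whole (e.g. a Java KeyStore)
                return "file-password".toCharArray();
            }

            @Override
            public char[] getKeyPassword(String path, String keyId) {
                // passphrase to decrypt an individual key, when needed
                return "key-passphrase".toCharArray();
            }
        });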

This key support is an initial version and will be improved in further releases of atbash-jwt-support with more features and more supported formats.

Atbash Rest Client

A first release was done mid-June and contained an implementation in Java 7 for Java SE and Java EE which is compatible with the MicroProfile Rest Client specification (see here). It allows you to ‘inject’ or create (useful in Java SE environments) a system-generated Rest client based on the definition of your JAX-RS endpoint in an interface class.

In this release, the RestClientBuilderListener from the MP Rest Client spec 1.1 is added and implemented so that we can define some additional providers in a general way. This is important for the Octopus release so that we can add the credentials, stored within the Octopus context, to the JAX-RS call automatically, without the need to specify the providers manually.
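As an illustration, a minimal sketch of how this fits together, using the standard MicroProfile Rest Client API (the Atbash port may use different Maven coordinates, and MyAuthenticationProvider is a hypothetical provider; both types are shown together for brevity):

import javax.ws.rs.GET;
import javax.ws.rs.Path;

import org.eclipse.microprofile.rest.client.RestClientBuilder;
import org.eclipse.microprofile.rest.client.spi.RestClientBuilderListener;

// The JAX-RS endpoint is described as an interface; the client implementation is generated from it.
@Path("/status")
public interface StatusService {

    @GET
    String currentStatus();
}

// A RestClientBuilderListener, registered through the ServiceLoader mechanism, can register
// providers (for example one that adds the credentials) on every generated client.
class AuthProviderListener implements RestClientBuilderListener {

    @Override
    public void onNewBuilder(RestClientBuilder builder) {
        builder.register(MyAuthenticationProvider.class);
    }
}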

Atbash Octopus

And of course, many new features are added to Octopus. They are migrated from the old Octopus or newly added.

The highlights are:

– Added support for the KeyCloak server. JSF applications can use the authentication and authorization from KeyCloak configured realms. Also, the AccessToken from it can be passed in the header of other requests and verified by JAX-RS endpoints. The only thing which is needed is the location of the KeyCloak server and the realm config in JSON (which is supplied by KeyCloak).

– The SPI option to pass the expected password for a user can now handle hashed passwords. Both the ‘standard’ algorithms from MessageDigest, like SHA-256, and the key derivation function PBKDF2 can be defined easily.

– The authorization annotations, like @RequiresPermissions, can be specified on JAX-RS methods without the need to define those resources as CDI or EJB beans.

– Authentication and authorization information can be converted automatically to an MP JWT Auth compliant format and used in calls to JAX-RS endpoints. This makes it possible, for example, to integrate JAX-RS resources protected by KeyCloak and MP JWT seamlessly.

And there are too many other features to describe here in detail. The user manual is also started and will be announced soon.

Overview all released frameworks

Utilities : 0.9.2

Set of utilities for Java SE, CDI and plain JSF which are very useful in many projects running in one of these environments.

  • Added utility class for HEX encoding (next to the BASE64 encoding)
  • Added support for byte arrays and encoding (HEX and BASE64) through the ByteSource class.

JSON-smart : 0.9.1

A small library (for Java 7) which can convert JSON to Java instances and vice versa.

  • Added support for @JsonProperty to define the name of a JSON property.
  • Contains an SPI so that other naming annotations (like the Jackson one) can be used.
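As a small illustration, a class using that annotation could look like this (the import of @JsonProperty comes from the Atbash JSON library; its exact package is not shown here):

public class Person {

    // maps the JSON key "first_name" to this field; @JsonProperty comes from the Atbash JSON library
    @JsonProperty("first_name")
    private String firstName;

    public String getFirstName() {
        return firstName;
    }

    public void setFirstName(String firstName) {
        this.firstName = firstName;
    }
}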

Atbash-config : 0.9.2

Extension for the MicroProfile Config implementations. Also a Java 7 port of Apache Geronimo Config.

  • Configuration for the base name (with serviceLoader class) is optional.
  • Port of MicroProfile Config 1.3 features to Java 7.

JWT Support : 0.9.0

Converts Java instances to JWT and vice versa, with extensive support for cryptographic keys (reading, writing, creating), supporting multiple types (like RSA, EC, and HMAC keys) and formats (like JWK, JWKSet, PEM, and KeyStore).

  • Support for reading and writing multiple formats (PEM, KeyStore, JWK and JWKSet).
  • Better support for JWT verification with keys using the concepts of KeySelector and KeyManager.

Atbash config server : 0.9.1

Configuration source for MP Config as a server supplying config through JAX-RS endpoints.

  • Added Payara micro as supported server to serve the configuration.

Atbash Rest Client : 0.5.1

Rest client implementation for Java 7.

  • Included RestClientBuilderListener from MP Rest Client 1.1 (to be able to define providers globally)

Octopus : 0.4

  • Integration with Keycloak (Client Credentials for Java SE, AuthorizationCode grant for Web, AccessToken for JAX-RS)
  • Support for Hashed Passwords (MessageDigest ones and PBKDF2)
  • Support for MP Rest Client and providers available to add tokens for MP JWT Auth and Keycloak.
  • Logout functionality for Web.
  • Authentication events.
  • More features for JAX-RS integration (authorization violations on JAX-RS resource [no need for CDI or EJB], correct 401 return messages, … )
  • Support for default user filter (no need to define user filter before authorizationFilter)

Conclusion

The release contains a lot of goodies related to security. In the coming months, new features will be added, support for Java 8 and 11 is planned, and user manuals and cookbooks will become available to get you started with all those goodies.

The Atbash repositories, with some more info and of course the code, can be found on GitHub.

Have fun.

Categories
Atbash Overview

Atbash repositories overview

Now that I have been working on the Atbash repositories for a bit more than 6 months, it is time to give you an overview of them.

For the moment, almost all of them are geared towards Java EE 7 and Java 7.

The idea is to create a MicroProfile compatible experience on Java EE 7. As much as possible of course. Since MicroProfile is based on Java 8, it is not always easy (or possible) to have an identical experience.

But the idea is to give the developers the possibility to have a “smooth” migration from Java EE 7 to MicroProfile by having all or the most important specifications, available on those servers.

Utility repository

Github

This contains some code used in multiple other Atbash repositories. It is not directly related to the goal but can have some usages in any project.

utils-se

– BASE64 encoder and decoder
– Utility class related to searching classes and resources on different class loaders (Current Thread, class loader of utility class or System class loader) and instantiating classes with advanced argument matching.
– Reading library or framework version from manifest file
– Check to see if an application is running within a CDI container
– Utilities related to verification and handling proxies (like determining original class)

– Reflection utilities useable in unit testing (so that we can read and set private properties without the need for setters and getters which would only be needed for testing)

utils-cdi

– Programmatic retrieval of CDI Beans, also for an optional bean (bean defined by Producer method which may or may not be present)
– Retrieval of a bean defined by Producer method involving generic types (issue due to type erasure)

– Fake bean manager useable within unit tests to supply some Mock Cdi bean instances when using programmatic CDI bean retrieval.

utils-jsf

– Creating a method expression based on the expression value
– Retrieving property values from JSF components (taking into account expressions and static values)
– Custom component finding which starts at the component itself and then moves up to the parent until the component is found or the view root is reached.

Atbash JSON (Java SE)

GitHub

Alternative JSON-B implementation (not following the specs!) to convert Java instances to JSON and vice versa. The code is adapted from the JSON smart framework (no longer maintained) for the use cases within Atbash (see Atbash JWT support) and extended to have customizations.

JWT Support (Java SE + CDI)

GitHub

The code in this repository is to support JWT (JSON Web Token, signed but also encrypted) as they appear in protocols like OAuth2 and OpenId Connect.

To support this encoding and decoding, there is also a unified handling of cryptographic keys. It is capable of reading them from a PEM file, a Java Keystore, a JWK and a JWKSet. Various encrypted formats for the Private Key are supported.

But the idea goes a bit further than just supporting OpenId Connect JWTs. Instead of exchanging data as JSON, why not exchange it as a JWT? That way, we are not only transferring information but also making some guarantees like sender verification and end-to-end protection, as we can detect changes through the signature.

Atbash Config (Java SE + CDI)

GitHub

The Atbash config is a Java 7 port of the MicroProfile Config specification and the Apache Geronimo implementation.

But there are also some extensions created like
– Support for custom named configuration files.
– Support for Stages so that based on a system property, values can be overridden in environments like Test environment.
– Support for custom formatted date values
– Logging of configuration values at startup of the application.
– Support for YAML format.

There is also a Config provider for testing available where the configuration values are stored in a HashMap.

Atbash Config server (MicroProfile Product)

GitHub

Allows defining the configuration values for MicroProfile applications in a central place. These applications can retrieve the values using a JAX-RS endpoint.

There is also a client implementation available which, when added to the application, retrieves the values automatically, based on the configured endpoint of the config server.

Jerry (JSF)

GitHub

Jerry defines a JSF Renderer interceptor which allows you to perform various tasks on any JSF component.

It is the basis for a more advanced validation mechanism for JSF and the declarative security features of Octopus for JSF components.

Valerie (JSF, Bean Validation)

GitHub

Using the interceptor mechanism of Jerry, Valerie automatically places the validation constraints of the Java properties (like not null, length, etc.) on the JSF components that use them.
This way it allows having the visual aspects of these constraints on the JSF component HTML representation and JSF validation without the need to put these constraints as JSF attribute values.

Octopus (Java SE, Java FX, CDI, JSF, JAX-RS)

GitHub

Octopus is a large framework consisting of small artifacts that bring you every aspect of authentication and authorization for Java EE and MicroProfile applications.

  • Permission-based framework
  • Secures URL, JSF components, and CDI and EJB method calls
  • Support for Java SE, JavaFX, JAX-RS and JSF
  • Integrates with OAuth2, OpenId Connect, LDAP, database, JWT, KeyCloak, CAS, …
  • Custom OpenId Connect solution within a microservices environment
  • Compatible with (and using) Microprofile (Config, Rest Client, MP-JWT, …), Java EE Security API, …
  • Very flexible, can be easily integrated within your application
  • Tightly integrated with CDI
  • Type-safe definition of permissions
  • Declarative declaration of JSF security (with tags, not using rendered attribute)
  • A custom voter can be created for more complex security requirements

Dependencies overview

The following Neo4J graph gives you an overview of the dependencies between these repositories and the relation with other libraries. Octopus is left out of the image because it would complicate it too much.

Have fun

Categories
JavaFX Security

Declarative Security permissions for Java FX FXML Views

Introduction

Using declarative permissions is the preferred way to have security constraints on GUI elements, separated from the business logic and fine-grained because we are using individual permissions/access rights.
With JavaServer Faces, you can add custom tags, but within JavaFX FXML views you can do something similar. And recently I found the time to create a POC for an idea that I had some years ago.

Declarative Permissions

When defining indications within the view of your application, GUI elements can be active or not visible depending on the permissions the user has. By placing them within your view markup you have 2 major benefits:

  • The ‘definition’ of when an element should be visible is done on the element itself, and thus it is very clear on which element it operates.
  • It isn’t mingled with your other business code.

Also, using permissions is much better and easier to work with than roles. When each feature is assigned specific permissions which the user needs to have, no application redeploy is needed when a feature can now be used by more users. You just have to assign the permission to the users.
Roles are assigned to users, and thus when a single feature needs to be accessible by a group of people, you can’t assign the role to them as it would also give them access to other features. And thus you end up changing the code.

JavaFX FXML

With the FXML views, you can define how the view is presented to the end users. Just as with JavaServer Pages tags, it is also possible to create your own custom components to tailor the system to your needs.
Adding additional information to the standard components is also possible by means of the userData tag.

<Button mnemonicParsing="false" onAction="#permission" text="Only admin permission">
  <userData>
    <RequiresPermissions value="admin"/>
  </userData>
</Button>

You can add information there which can be used by your code later on. You can even add multiple data sets by using the JavaFX Collections.

Here in our case we add an instance of the Octopus JavaFX RequiresPermissions class and specify that the user needs the permission admin before the button will be visible.

In order to make this work, you also need to import the class (just as within regular Java code)

<?import be.atbash.ee.security.octopus.javafx.authz.tag.RequiresPermissions?>

Processing the view

In order to remove the component from the view when the user doesn’t have the required permissions, we need to scan the view just before it is ‘rendered’ to the screen.
This can easily be achieved with the excellent afterburner framework, which has a method getView() defined within the FXMLView class.

Octopus defines a child class, be.atbash.ee.security.octopus.javafx.SecureFXMLView, which performs the following steps when the view is retrieved

  • Scan all view nodes
  • Verifies if there is an instance of the RequiresPermissions within the userData section
  • Checks if the user has the permission or not.
  • If (s)he has not, sets the node to not visible (node.setVisible(false) )
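A minimal sketch of those steps is shown below (the real implementation lives in SecureFXMLView; hasPermission() is a placeholder for the actual Octopus authorization check and getValue() is an assumed accessor on RequiresPermissions):

import javafx.scene.Node;
import javafx.scene.Parent;

import be.atbash.ee.security.octopus.javafx.authz.tag.RequiresPermissions;

class PermissionViewScanner {

    void applyPermissions(Parent root) {
        for (Node node : root.getChildrenUnmodifiable()) {
            Object userData = node.getUserData();
            if (userData instanceof RequiresPermissions) {
                String permission = ((RequiresPermissions) userData).getValue();
                if (!hasPermission(permission)) {
                    node.setVisible(false);  // hide the component for this user
                }
            }
            if (node instanceof Parent) {
                applyPermissions((Parent) node);  // walk the child nodes as well
            }
        }
    }

    private boolean hasPermission(String permission) {
        // placeholder: the real view delegates to the Octopus subject/permission check
        return false;
    }
}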

Octopus Java FX support

Next to the processing of the view, Octopus now also has support for JavaFX views, helping the developer in creating a login screen and providing the integration with the Octopus core classes which perform the required steps for authentication and authorization.

This support is built on top of the afterburner framework which makes developing JavaFX applications easy.

You can start the application by using the start method from the be.atbash.ee.security.octopus.javafx.SecureFXMLApplication class.

SecureFXMLApplication.start(stage, new LoginView(), new UserPageView());

The loginView itself must extend from be.atbash.ee.security.octopus.javafx.LoginFXMLView and contain components with ids user, password and loginButton.

The start() method must also receive an instance of the first real page which will be shown after the user is authenticated. This is an FXML view where the Java class extends from be.atbash.ee.security.octopus.javafx.SecureFXMLView. These views are scanned for the contents of their userData section and components are hidden when the user doesn’t have the required privileges.

Page navigation can be performed by be.atbash.ee.security.octopus.javafx.SecureFXMLApplication#showPage, and this method also performs the hiding of components based on the authorization info of the view.
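As a small sketch, navigating to another secured view could look like this (OrderView being a hypothetical FXML view of your application):

SecureFXMLApplication.showPage(new OrderView());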

Demo

The Atbash Octopus framework code has a demo application to show this feature in action. It can be found on GitHub.

For those who want to get a quick idea of how it looks, I also recorded a short video which demonstrates the demo.

Conclusion

Declarative permissions can also be used on JavaFX FXML views, just as you can do with JavaServer Faces. Instead of custom tags which define the required permissions, you define the permission requirements within the userData section which is available on each JavaFX component.
This is only a POC and the functionality will be extended in future versions of Atbash Octopus (as the rewrite of the Octopus framework is currently still in Alpha).

Have fun with it

Categories
Project setup

Generating (Maven) Projects

Introduction

The first thing we do when starting a new application is setting up our project. A lot of people use Maven for this, and you have various options for how you can do this.
Since I create a lot of projects for testing out various features of the Atbash projects or for trying things out (I have around 300 projects on my computer today), I was searching for some help to do this.

Application setup

When trying things out, I need to have some minimal project setup. Not only do I need the Maven pom file with the required dependencies, I also need some files like configuration or a login file for Octopus.

I do not need tools that generate the JPA entities, JSF screens or JAX-RS endpoints for me. Creating a few of them is not difficult with the current IDEs, and that automatic generation is mostly too opinionated to be usable in production systems anyway.

So what are the options to have this minimal project setup?

Copy and Paste

You can always copy such a minimal project you have already created and make some modifications:

  • Change the groupId and artifactId of the pom file
  • Update the dependencies
  • Update the configuration file to match your new dependencies and use case you want to test out.
  • Change the java package with your java code
  • Add or remove code depending on the feature for this use case.

This is a valid procedure and in most cases the most efficient one. You don’t need to learn any new tool or extend it with your features (like Octopus in my case), but of course, it is not the most fun way of doing things.

JBoss Forge

JBoss Forge https://forge.jboss.org/ is a tool specifically designed to get your application up and running fast. It not only generates the project for you but also can help you in generating those well-structured classes like JPA entities, JAX-RS endpoints, and screens with JSF or Angular(JS).

I used it in the past and it can be a valuable tool, but it is not the right thing for what I want.
Although you can generate your Maven project, you need a script and/or a custom add-on for the features that I want (specific dependencies within the pom.xml file and some configuration and Java files).

It also has a lot of options to generate code for you, but again this is sometimes opinionated and not what you or your company wants to use.

Other tools

Since there are various other tools on the internet, the topic seems to be important for a lot of people. Similar tools which I found are generjee and factorEE.
Not all of them are maintained anymore, or they focus on the wrong things (for my use case) like generating code.
I also find a CLI very important in this case since it allows you to create an application very rapidly (and recreate it based on the same configuration).

Jessie

So, due to the lack of the right tool for me, I created one of my own called Jessie (Java EE Start project as it was geared at that time towards Java EE). Last year with the release of CDI 2.0, I did a rewrite of it using the Java SE CDI container and included the usage of ThymeLeaf as templating mechanism so that it became easier to generate files which can be customized.

In the open source spirit of Atbash, I placed the code on GitHub and created a front end for it.
You can play with it on the OpenShift V3 instance or download the code; it just needs a few parameters within a YAML config file and the CLI version generates your project.

The first version has support for Java EE and additional frameworks like DeltaSpike, PrimeFaces, and Octopus, the security framework.

Since I’m also involved with MicroProfile lately, a version for MicroProfile and the support for the different vendor implementations like LibertyIO, WildFly Swarm, Payara Micro, etc. will be created later on.

Conclusion

When setting up a new project, the most efficient solution is creating it from scratch or starting from a similar existing project and making the required adaptations.
Another useful tool is JBoss Forge, mainly when you are interested in generating some code.

Jessie was created a few years ago as a toy and is now available on GitHub. It was a good learning opportunity for CDI 2.0, templating with ThymeLeaf and creating a tool which runs in multiple environments (CLI, Java FX, Web/JSF).

Maybe it is also helpful for you.

Have fun.

Categories
Configuration

Implicit converters for MicroProfile Config

Introduction

Last month (January 2018), an update of the MicroProfile Config specification, version 1.2, was released which makes it easier to convert the configuration values to class instances.
It adds the concept of implicit/automatic converters and removes the requirement for creating specific converters in many cases.
The first implementations are already available and Atbash Configuration is now also updated.
This blog describes what the implicit/automatic converters are and how you can use them.

Converters

The configuration values we define are Strings because they are defined as a set of characters in the configuration files. Using the configuration could then be limited to a system which is capable of reading those Strings, leaving the developers to convert these values to the proper instances (like Integer, Boolean or application-specific classes) themselves.
But in that case, each developer would write his own set of converters, and in the case of Integers and Booleans, for example, they are all the same.
So the framework itself can take care of these converters. And since we then already have the concept of a converter, it is easy to foresee an SPI which allows the creation of custom converters by the developer for their application-specific classes.
These basic converters for Integer and Boolean, for example, were present since the first release of the spec, just like the SPI to add your own converters.

Implicit Converters

But the creation of all those custom converters can be minimized further. Since our application-specific classes can be created from a String, they probably have some means of being instantiated through a method which takes a String parameter:
– The constructor
– A static method with a name like valueOf or parse.
So with the new implicit/automatic converters feature, when we have some class like this
public class ClassWithStringConstructor {
    private String value;

    public ClassWithStringConstructor (String value) {
        this.value = value;
    }

    public String getValue() {
        return value;
    }
}
and have a parameter
some.key = Atbash

We can retrieve an instance of the class, with the value using

config.getValue("some.key", ClassWithStringConstructor.class)

or when we are in a CDI enabled context, we can also use injection

@Inject
@ConfigProperty("some.key")
private ClassWithStringConstructor parameterValue;
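The static factory variant mentioned earlier works the same way; a minimal sketch (the class Temperature is purely illustrative):

public class Temperature {

    private final int celsius;

    private Temperature(int celsius) {
        this.celsius = celsius;
    }

    // picked up by the implicit converter because of its name and single String parameter
    public static Temperature valueOf(String value) {
        return new Temperature(Integer.parseInt(value));
    }

    public int getCelsius() {
        return celsius;
    }
}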

Collection type parameters

Another nice addition to this 1.2 release, is the use of collection type parameters. The ability to have multiple parameter values, like a list of values which is converted to an Array, List or Set.
It is quite common that a configuration value is, in fact, a list of values.
pets = dog,cat
Previously, the developer needed to split the String and use the individual parts separately (although this is a very simple task with String.split()); now this can be done automatically.
config.getValue("pets", String[].class)

With injection, we can automatically convert it to a List or a Set.

@Inject
@ConfigProperty(name = "pets")
private List<String> pets;

The List and Set variants can’t be used in a programmatic way (using config.getValue) since the second parameter doesn’t accept parameterized types (like List<String>).

When you have a value which contains a comma in it, you need to escape it:
a,b,c\\,d

which results in 3 configuration values, where the last one is c,d.

Implicit combined with Collection.

We can even combine the collection type feature with the implicit/automatic converter feature. If we specify a class name like ClassWithStringConstructor instead of String, the configuration framework will use the appropriate way of converting.
config.getValue("pets", ClassWithStringConstructor[].class)

or

@Inject
@ConfigProperty(name = "pets")
private List<ClassWithStringConstructor> pets;

Octopus Config

As you could read in a previous blog, Octopus Config is a Java 7 port from Geronimo config. The support for the implicit/automatic converters and collections (like the arrays, List, and set) is taken from them.
Octopus Config itself is also updated so that the YAML usage supports the Collection type parameters. The example above with the pets can be written in the YAML equivalent format as
pets: [dog , cat]

The release 0.9.1 also contains some other small features, which are listed here. The most notable features are:

– Improved Class converters based on different classloaders.
– Test artefact.
– Logging in Java SE environment
– Improved information in logging with @ModuleConfigName
– Compatibility with another MicroProfile Config implementation on Java 8.

Conclusion

With the addition of the implicit/automatic converters, the need for writing custom converters can be dropped and conversion will take place by convention.
The Octopus Config version v0.9.1 is updated to incorporate these features together with some small other improvements and fixes.
Have fun.
Categories
Security

Key derivation functions

Introduction

Although initially intended for creating better secret keys, key derivation functions are probably even better suited for storing hashed passwords. This post tells you more about what a key derivation function does, how you can use it for storing hashed passwords and how you can use it in the Octopus framework.

What is it?

For symmetric encryption you need a secret key, basically an array of bytes. A password can also be converted to a byte array, so a password can be used. Then the key doesn’t need to be stored, and when the user types in the password, the message can be encrypted or decrypted.

But of course, passwords are not a really good candidate as the key for encryption.
– Passwords consist of characters which all have more or less the same bit pattern, so the ‘random’ spread of bits throughout the byte array is not good.
– Passwords are mostly short, so your key is probably too small.

So that is where key derivation functions come into the picture. They take a password or passphrase (a sentence used as a password) and can generate a byte array out of it. Such a function has the following characteristics:
– It needs a salt, a byte array, to be able to do its work. So salting is mandatory.
– When the same password/passphrase and salt are presented to the key derivation function, it results in the same byte array outcome (so it is deterministic, no random outcome).
– It is a one-way procedure. So from the output bits, the original password/passphrase can’t be reconstructed.
– The output, the number of bits generated, is configurable (important because key sizes for encryption must have some minimum lengths).
– The functions are intentionally slow to counter any brute-force attack. And iterations can be applied to make it even more difficult.

You can read more background info on this wiki page

For hashed passwords

In the previous section, you can read why those Key derivation functions are very good for generating a secret key (for symmetric encryption). The properties are designed specifically for that purpose.

But very rapidly, people saw an alternative use for these functions: creating hashed passwords. The reasons why are easy to see:
– When offered the same input, the same output is generated
– The input can’t be reconstructed from the output.

But they have a few other properties which make them a better candidate for storing passwords than the classic hashing functions.
– The computation requires a salt, so generating an output byte array is not possible without it. This is in contrast to a classic hash algorithm, where we add the salt to the password to overcome the use of rainbow tables.
– The computation of key derivation functions is slow, that of hash algorithms is fast. And performing the hashing function multiple times is not a good idea because of the increase in collisions.

Algorithms

So we are now at the point that we can say that Key Derivation functions are better for storing hashed passwords. So how can we get started with it in our Java programs?

There are different algorithms developed (just like there are different hash algorithms) like argon2, scrypt, bcrypt or PBKDF2.
Only one is available by default on the JVM, PBKDF2, although most people say argon2 is the best algorithm. So let us see how we can use that one on the JVM.

Since key derivation functions are something completely different from hash algorithms, they aren’t available from the java.security.MessageDigest class. But there is another standard Java class available which gives you an instance of the algorithm, javax.crypto.SecretKeyFactory.

The following snippet generates the BASE64 encoded output for a certain password. It also generates a salt value.

import java.security.SecureRandom;
import java.util.Base64;

import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;

String password = "atbash";

String keyAlgorithmName = "PBKDF2WithHmacSHA256";
SecretKeyFactory keyFactory;
keyFactory = SecretKeyFactory.getInstance(keyAlgorithmName);

char[] chars = password.toCharArray();

byte[] salt = new byte[32];
new SecureRandom().nextBytes(salt);

int hashIterations = 1024;

int keySizeBytes = 32;

byte[] encoded = keyFactory.generateSecret(
        new PBEKeySpec(chars, salt, hashIterations, keySizeBytes * 8)).getEncoded();

String hashed = Base64.getEncoder().encodeToString(encoded);

System.out.println(hashed);

So using them is quite easy.

Octopus support

Also within Octopus, support has been added for the PBKDF2 algorithm within version 0.9.7.1. And since the way you use it is so similar to classic hash algorithms, the only thing you need to do is set the algorithm name in the configuration file.

hashAlgorithmName=PBKDF2WithHmacSHA256

See also the chapter in the Octopus Cookbook

Conclusion

Key derivation functions are designed for creating a key which can be used in Symmetric encryption algorithms based on a password.

But it turns out that they are also very well suited to store hashed passwords. So try them out and use them in your next project, with or without the Octopus security framework.

Have fun.

 

Categories
Security

Java EE Security API integration with Octopus

Introduction

The goal of the Java EE Security API (JSR-375) is to enhance the security features of the Java EE platform and to make sure that no custom configuration of the application server needs to be updated anymore when you want to use, for example, hashed passwords stored in a database table.
It was released together with Java EE 8 in September 2017, and one of the main concepts it defines is the IdentityStore, which validates the user credentials (like username and password) and retrieves the groups (the authorization grants) the user has.

The Octopus framework is mainly targeted at authorization features like declarative permissions usable for JSF components (with a custom tag) or annotations (to protect EJB methods for example). There are many authentication integrations available like OAuth2, OpenId Connect, KeyCloak, CAS server and custom integrations for database usage, to name a few.

Since v0.9.7.1, released on 27 December 2017, it is possible to integrate the IdentityStores from JSR-375 within Octopus and, most of all, it also runs on Java EE 7.

The demo code can be found in the Octopus demo repository.

Project setup

The JSR-375 integration is defined within a Maven artifact specifically created for this purpose,

<dependency>
    <groupId>be.c4j.ee.security.octopus.authentication</groupId>
    <artifactId>security-api</artifactId>
    <version>${octopus.version}</version>
</dependency>

Together with the dependency for JSF on a Java EE 7 server,

<dependency>
    <groupId>be.c4j.ee.security.octopus</groupId>
    <artifactId>octopus-javaee7-jsf</artifactId>
    <version>${octopus.version}</version>
</dependency>

they can be found in the Bintray repository

<repository>
    <id>Bintray_JCenter</id>
    <url>https://jcenter.bintray.com</url>
</repository>

These are the other dependencies which we need

  • Java EE 7 (Provided) all the features of the Java EE 7 platform
  • PrimeFaces (Compile) The JSF Component library
  • Deltaspike (Compile/Runtime) Required dependency of Octopus but set as provided within Octopus so it is easier to specify the version you want to use within your application
  • Soteria (Compile) JSR-375 implementation and required since we are using a Java EE 7 server.
  • H2 database (Runtime) Our database where we want to store hashed passwords.

Security config

In this example, I want to show you the usage of a JSR-375 defined IdentityStore and a custom one.

The default defined IdentityStore is the Database one which stores the passwords hashed. It can be configured by putting an annotation on a CDI bean.

@DatabaseIdentityStoreDefinition(
        dataSourceLookup = "java:global/MyDS",
        callerQuery = "select password from caller where name = ?",
        groupsQuery = "select group_name from caller_groups where caller_name = ?",
        hashAlgorithmParameters = {
                "Pbkdf2PasswordHash.Iterations=3072",
                "Pbkdf2PasswordHash.Algorithm=PBKDF2WithHmacSHA512",
                "Pbkdf2PasswordHash.SaltSizeBytes=64"

        }
)

The datasource is defined within the DatabaseSetup class, which ensures the correct tables are created and filled with a few credentials.

This @DatabaseIdentityStoreDefinition is placed, in our demo, on the CDI bean which defines the second IdentityStore, which is consulted when the database store doesn’t contain the username we specified.

This custom store is created by implementing the interface javax.security.enterprise.identitystore.IdentityStore and overriding the validate() method.

Here we check if it is a certain fixed user and if the correct password is supplied. If so, it returns a few groups. Of course, you can write any kind of logic in these custom stores when the default ones don’t fit your needs.
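Such a custom store could, as a sketch, look like this (the user name, password, and groups are illustrative values, not the ones from the demo):

import java.util.Arrays;
import java.util.HashSet;

import javax.enterprise.context.ApplicationScoped;
import javax.security.enterprise.credential.UsernamePasswordCredential;
import javax.security.enterprise.identitystore.CredentialValidationResult;
import javax.security.enterprise.identitystore.IdentityStore;

@ApplicationScoped
public class FixedUserIdentityStore implements IdentityStore {

    // called by the default validate(Credential) method based on the credential type
    public CredentialValidationResult validate(UsernamePasswordCredential credential) {
        if ("test".equals(credential.getCaller())
                && credential.getPassword().compareTo("secret")) {
            // caller name and the groups assigned to this fixed user
            return new CredentialValidationResult("test",
                    new HashSet<String>(Arrays.asList("viewer", "reporter")));
        }
        return CredentialValidationResult.INVALID_RESULT;
    }
}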

Authorization

JSR-375, and Java EE in general, expect that each user has some groups defined in the external system which are then converted into roles in your application.

Octopus can also work with roles but one should use permissions because they are far more powerful. It has very powerful permission support like named, wildcard and domain permissions.

The groups of the user are interpreted as follows:
– The group is converted to a role with the same name (for the case you want to work with roles within Octopus)
– The group is converted to a named permission.
– The group is converted to a set of permissions by a CDI bean implementing the RolePermissionResolver which needs to be supplied by the developer.

This last option is probably the best and most powerful way to convert a group to a set of permissions and thus getting the maximum out of the features of Octopus.
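A sketch of such a resolver, assuming the Shiro-style RolePermissionResolver interface that Octopus builds on (the exact interface and package within the Octopus artifacts may differ):

import java.util.ArrayList;
import java.util.Collection;
import java.util.List;

import javax.enterprise.context.ApplicationScoped;

import org.apache.shiro.authz.Permission;
import org.apache.shiro.authz.permission.RolePermissionResolver;
import org.apache.shiro.authz.permission.WildcardPermission;

@ApplicationScoped
public class DemoRolePermissionResolver implements RolePermissionResolver {

    @Override
    public Collection<Permission> resolvePermissionsInRole(String roleString) {
        List<Permission> result = new ArrayList<Permission>();
        if ("admin".equals(roleString)) {
            // the 'admin' group expands into several fine-grained permissions
            result.add(new WildcardPermission("user:read:*"));
            result.add(new WildcardPermission("user:write:*"));
        }
        return result;
    }
}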

No full integration

The solution described here integrates the IdentityStores of Java EE 8 (JSR-375) into the Octopus framework. It does not use JASPIC as the underlying mechanism as JSR-375 describes, nor is there any integration with the other security features of Java EE like @RolesAllowed.

Java EE 8

And your application, like the demo, will be ready to convert to Java EE 8 with a minimal amount of effort. The only thing you need to do is take the Java EE 8 dependency (instead of the Java EE 7 one) and remove the Soteria dependency.

There will be no other changes required to make your application run on Java EE 8 and make use of all the features in that latest release.

Conclusion

With the Octopus framework, you have today already the most powerful authorization features available. And by using it, you are preparing yourself for the future when Java EE also has the authorization features available which are planned within the Java EE Security API.

Have fun.

Categories
Security

Session Hijacking protection with Octopus framework

Session Hijacking

What is Session Hijacking? Most web applications store information about the user (the classic example is the shopping basket) in the HTTP session, but also the basic user information. So, whenever someone else knows the ID of the session, they can impersonate the victim and act on his/her behalf.

The following image shows the attack (image by A. Mitra)

Octopus protects

But there exist a few techniques to identify whether such an attack is in progress for a certain session, and such a technique is implemented within the Octopus framework (from v0.9.7 on).

You don’t have to do anything specific to activate this feature; it is on by default.

When such an attack is detected, the attacker gets a blank page with the message that access was denied due to the detection of a Session Hijacking attack. The actual victim can also be informed of the attempt to take over his/her session.

See the following short video to see the feature in action.

Youtube video

The code used to create the video is on GitHub.

Have a look at the parameter session.hijacking.level to configure the feature in case you have issues or want to deactivate it.

Conclusion

With the Session Hijacking protection, Octopus tries to make applications as safe as possible. It is only one of the security features it has. You also saw in the video that the session IDs are changed when the user logs on and logs off, just as OWASP recommends.

Have fun.

Categories
Security

Release Octopus v0.9.7

Introduction

I’m happy to announce the latest version of the Java EE Security framework Octopus, version 0.9.7.

After more than a year of development and testing and 2 projects where the major new features are already used, we feel it is time to release a final version of the framework.

The theme of this release is Self-Contained Systems. (Only for Java EE 7+)

  • Octopus SSO server compatible with OpenIdConnect protocol.
  • Transfer authentication and permission information within the header of JAX-RS rest calls.
  • Support of String based permissions.

But there are other important new features like

  • Support for RBAC – Role Based Access Control.
  • Implementing OWASP recommendations about Session management.
  • Support for 2-step authentication with OTP (One-time password)
  • Possibility to create a more advanced (than a custom voter) security annotation.
  • Initial support for Java SE

Self-Contained System support

Octopus SSO Server

In this announcement, I want to tell you something more about the support for Self-Contained Systems (SCS).

You can see them as a variant of microservices; the important feature of an SCS is that it is responsible for the data of a certain domain (like products or orders) and that it contains all the business logic and UI to work with that data.

Interaction with other SCSs, preferably asynchronous, goes through their UI, since web pages and REST endpoints are both considered UI.

The image shows these basic features of an SCS.

For an excellent introduction, I can recommend the site http://scs-architecture.org/

With this latest release of Octopus, implementing security within an SCS architecture becomes quite easy.

With the Octopus SSO server, you can put one of the SCSs in charge of the security. Based on the OpenIdConnect protocol, the other SCSs (clients) can delegate the authentication process to the Octopus SSO server. An additional scope, ‘Octopus’, makes it possible that the permissions of the authenticated user are transferred to the client.

The image below gives an idea how it works (Authorisation grant flow)

The maven artefacts that one can use for this feature are
be.c4j.ee.security.octopus.sso:octopus-server and be.c4j.ee.security.octopus.sso:octopus-client

Because permissions need to be transferred from one SCS to another, it becomes easier if these permissions are just Strings. Although a more type-safe approach like enum values is possible, Strings are more pragmatic here. (example further in the text)

JAX-RS communication between SCS

Also, the communication with a JAX-RS endpoint can easily be secured with the SCS user modules.

The idea is that the information of the currently authenticated user is packed within a JWT and sent in the header of the REST call. The server endpoint uses this info from the header to recreate the principal and his permissions.

The normal Octopus security annotations, like @RequiredUser or @OctopusPermissions(“order:read:*”), can be used on the server side (containing the JAX-RS endpoint).
When the calling user doesn’t have the permission “order:read:*”, a status 403 is automatically returned.

On the server side, you can have something like this

@Path("/user")
public class UserController {

   @GET
   @OctopusPermissions("User:Read:*")
   public List<User> retrieveAllUsers() {
      ...
   }
}

The client side, for calling this endpoint can look like this

public class UserControl {

   @Inject
   private OctopusSCSUserRestClient restClient;

   public List<User> loadAllUsers() {
      // java.util.Arrays wraps the returned array into the List the method declares
      return Arrays.asList(restClient.get(hostURL + "user", User[].class));
   }
}

Be aware that directly calling one SCS from another results in tight coupling, which is not suited to most situations.

The maven artifacts that one can use for this feature are be.c4j.ee.security.octopus.authentication:jwtscs-server and be.c4j.ee.security.octopus.authentication:jwtscs-client

Documentation

The user manual can be found here, and example programs have been created and will be available soon on GitHub.

More examples and better documentation can be expected when I migrate the Octopus framework into the Atbash organization of Github in the coming months.

Compatibility

This last section is about compatibility with the current application servers. Some of the new features in this release of Octopus only work on Java EE 7+, but in general the Octopus framework can be used on any Java EE 6, Java EE 7 or Java EE 8 compliant server.

Yes, Octopus has been tested with the GlassFish 5 server and no issues were found. Just use the Java EE 7 version artifacts of Octopus and they will work just fine.

Later on, there will be Java EE 8 specific artifacts which use the JSON-B API instead of the net.minidev:json-smart artifact.

 
