Sunday, December 11, 2011

JavaOne Latin America 2011

Last December 5, 2011, I traveled to São Paulo to attend JavaOne Latin America 2011. I wrote this post together with Adrián Biga (Neoris); this way, we were able to cover sessions that were presented at the same time :).

We're going to write about the technical presentations that we watched.

Since this is going to be a pretty long post, I'm going to start with the acknowledgements. I'm afraid most people will not read the full post :P

I would like to thank NeuralSoft for sponsoring this trip (in fact, they paid all the costs). I'll do my best to use the things I learned at this event to help improve the Presea Enterprise project. Also, I would like to thank my partners at Oxen for taking care of the company while I took this last-minute "vacation". I would like to thank Walter Trentín for replacing me as teacher of the Android course. Finally, I would like to thank my travel companions for making this event a really fun learning experience.

Finally, for photos and fun stuff, you can look at the JavaOne 2011 album on the Oxen Facebook page. Juan José Gil (NeuralSoft), who traveled with me, also posted many photos in his album.

Due to my deficiencies in listening to English and Portuguese, I (or Adrián) may have misunderstood something. If you realize that this is the case, please let us know.

But let's go to the technical stuff, which is the motivation for this post!

Day one - December 6, 2011

Keynote JDK Presentation - Speaker: Roger Brinkley
In this session, the speaker talked about the new JDK7 and the in-development JDK 8.

He presented many achievements reached in this release:
  • 4 new JSRs.
  • 1966 improvements.
  • 9494 bugs fixed.

He also enumerated new Java 7 features:
  • Diamond operator (<>), which allows generic type inference.
  • Strings in switch statements.
  • Multi-catch in try-catch blocks (|).
  • try-with-resources block for AutoCloseable objects.
  • Fork/join framework, for multicore parallel programming.
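Most of these features can be shown in a few lines. Below is my own illustrative sketch (class and method names are mine, not from the talk):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class Java7Tour {

    // Diamond operator: the type arguments on the right-hand side are inferred.
    static Map<String, List<Integer>> diamond() {
        return new HashMap<>(); // instead of new HashMap<String, List<Integer>>()
    }

    // Strings in switch statements.
    static int daysIn(String month) {
        switch (month) {
            case "february":
                return 28;
            case "april": case "june": case "september": case "november":
                return 30;
            default:
                return 31;
        }
    }

    // try-with-resources (BufferedReader implements AutoCloseable) plus
    // multi-catch: one handler for two unrelated exception types.
    static String firstLine(String text) {
        try (BufferedReader reader = new BufferedReader(new StringReader(text))) {
            return reader.readLine();
        } catch (IOException | RuntimeException e) {
            return null;
        }
    }

    public static void main(String[] args) {
        System.out.println(daysIn("june"));    // 30
        System.out.println(firstLine("a\nb")); // a
    }
}
```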

Finally, the speaker mentioned some new features expected for JDK 8:
  • Lambda Project: it will enable parallel processing using lambda expressions for building map-reduce approaches. Parallelism can be applied to other collection operations, such as filter.
  • Default methods: they will provide Java with multiple inheritance!!!
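For reference, this is roughly how these two features ended up looking in JDK 8. At the time of the talk the syntax was still under discussion, so take this as my own sketch, not what the speaker showed:

```java
import java.util.Arrays;
import java.util.List;

public class Jdk8Sketch {

    // A default method gives an interface a method body; implementing classes
    // inherit behavior (multiple inheritance of behavior, not of state).
    public interface Greeter {
        String name();
        default String greet() { return "Hello, " + name(); }
    }

    public static void main(String[] args) {
        // A lambda implements the single abstract method name().
        Greeter g = () -> "JavaOne";
        System.out.println(g.greet()); // Hello, JavaOne

        // Lambdas enable the filter/map-reduce style mentioned above.
        List<Integer> nums = Arrays.asList(1, 2, 3, 4);
        int sumOfEvens = nums.stream()
                             .filter(n -> n % 2 == 0)
                             .mapToInt(Integer::intValue)
                             .sum();
        System.out.println(sumOfEvens); // 6
    }
}
```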

Instant Cloud: Just Add Java - Speaker: Bruno Souza
The speaker started this session talking about the book The 4-Hour Workweek by Tim Ferriss. He spoke about the current life plan: working a lot now to have benefits in the future. The speaker had the idea that the cloud could help change this. I'm not sure about this; Oxen is an almost 100% virtual company and we work a lot :P

After that, he went to more technical stuff. He spoke about the importance of continuous deployment and how it can help require fewer work hours (I remembered an anecdote from when I was working at Neoris: I used to spend one hour a day making a manual deployment to testing just because they didn't want to throw out Visual SourceSafe).

The session continued with a demonstration of Google App Engine. The speaker talked about the trade-offs required when developing for GAE and how easy it is to move from testing to production. He also showed the GAE dashboard.

Next, he talked about Amazon Elastic Compute Cloud. Unlike GAE, Amazon provides virtual machines, where you can have, for example, preconfigured Linux installations or custom ones. You are free, for example, to recompile the kernel.

Amazon gives you a free VM for a year. But you can have more machines by activating them for less time. I'm not sure whether this time is counted per year, per month, etc. Amazon doesn't charge per request; they charge for other things such as traffic, fixed IPs, etc.

Amazon also provides AWS Elastic Beanstalk (currently in beta), which gives you a preconfigured Tomcat. Here you can run a standard Java Web application, which can be ported to other environments (unless you use Amazon-specific APIs, such as S3 storage).

The speaker mentioned a cool feature: Elastic MapReduce. This is an interesting service, so I'll try to take a look at it in the future.

Finally, the speaker enumerated some cloud tools that you could use in order to build your own cloud infrastructure, such as OpenNebula, OpenStack and RedHat OpenShift.

The demos were made using Git, Jenkins, Ant and ToolsCloud. Hey! They copied 50% of the environment that Juan José Gil built at NeuralSoft almost a year ago. However, Gradle is better than Ant! - just kidding :)

JavaEE Applications in Production: Tips and Tricks to achieve zero downtime - Speaker: Fabiane Biznella Nardon
The speaker talked about 3 topics:
  1. Continuous Deployment
  2. Application Monitoring
  3. Debug in Production (I think that the word "profiling" would be more suitable)

Continuous deployment
Instead of using GlassFish cluster capabilities, she used the HAProxy load balancer. In a first demonstration, the cluster was configured to check the server status every 5 seconds. However, while this provided load balancing, it wouldn't ensure 100% uptime (because of the 5-second delay).

After that, she improved the approach. She added a servlet that indicated the state of the application (active/inactive), which could be turned on/off by accessing a URL. With this tool, she modified the deployment script in Jenkins to change the flag, wait 10 seconds, and then make the deploy (and finally, change the status back to active). This way, the 10-second window gave HAProxy (which was configured to check the status against this URL) enough time to be aware of the status change.
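She implemented the flag as a servlet; the sketch below captures the same idea using the JDK's built-in HttpServer so it stays self-contained. The /status path and the set=active/set=inactive query parameters are my own invention, not hers:

```java
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.util.concurrent.atomic.AtomicBoolean;

public class StatusFlag {
    static final AtomicBoolean active = new AtomicBoolean(true);

    public static HttpServer start(int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/status", StatusFlag::handle);
        server.start();
        return server;
    }

    static void handle(HttpExchange ex) throws IOException {
        String query = ex.getRequestURI().getQuery();
        if ("set=inactive".equals(query)) active.set(false);
        if ("set=active".equals(query)) active.set(true);
        // 200 while active, 503 while draining: the load balancer's health
        // check sees the 503 and takes the node out of rotation.
        byte[] body = (active.get() ? "active" : "inactive").getBytes("UTF-8");
        ex.sendResponseHeaders(active.get() ? 200 : 503, body.length);
        try (OutputStream os = ex.getResponseBody()) { os.write(body); }
    }

    public static void main(String[] args) throws IOException {
        HttpServer server = start(0); // ephemeral port for the demo
        System.out.println("Health endpoint on port " + server.getAddress().getPort());
    }
}
```

In her demo the same toggle was driven from the Jenkins deploy script, with the short sleep before the actual deployment.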

The speaker showed the HAProxy console during the deploy in order to show the results.

The deployment process was pretty slow, even before the 10-second delay configuration :S

Application monitoring
She used the Zabbix open source monitoring tool in conjunction with the GlassFish REST interfaces. The servers were run using VirtualBox. On each server, a Zabbix agent was installed. In order to collect information, the agent was provided with a grep+awk+curl-based script that parsed the output (a log?) in order to extract JSON data.

Debug in production
She showed the use of the JConsole (provided by OpenJDK) and VisualVM (provided by the Sun JDK) tools. She spoke about the security risk of opening the ports required by these tools and gave some recommendations for detecting problematic classes.

Web Applications and Wicket Scala on GlassFish JavaEE 6 - Speaker: Bruno Borges
The speaker (an Apache member) started by talking about how the Java community started using standards such as the earliest JavaEE (J2EE at the time) specs and how the community built better, non-standard frameworks, such as Apache Wicket. However, he considers JavaEE 6 a good standard, so he said that it is a good moment to go back to the "patterns" (he used this word, meaning "standards", I guess).

In addition, he showed how to do Web development using the technologies from the session title: Apache Wicket, Scala and JavaEE 6 (on GlassFish).

The speaker talked about the agility of development environments such as Ruby, PHP or Python. He said that, regardless of other aspects such as maintainability, these tools provide a way to build an application quickly. It would be great to have such agility in Java, and that is one of the advantages of the selected tools.

He talked about "pixel perfect" applications: building the UI exactly like the customer required it. On this aspect, he differentiated two clearly separated roles: programmer and web designer. He showed a funny cartoon describing their common personalities.

Typically, a Web designer will use tools like Balsamiq for mockups and Photoshop for design. When building an HTML preview, the designer could use tools like Dreamweaver or Coda. Here, most of the Java frameworks have a problem: they require a server in order to render the output. This is one of the advantages of Wicket: the view is defined in pure HTML, just adding some attributes for server-side component binding. This way, as the speaker explained, the UI layer can be divided into 2 layers, just like the rest of the layers. So, Web designers could focus on building the page look and feel and the programmers could focus on building server-side Java UI components.

He showed the inheritance and composition model provided by Wicket. This framework is server-side oriented, so it has components stored in the server session that reflect the UI state and behavior. Also, the framework provides Ajax support. Regarding this aspect, it provides an interesting feature: a user-transparent fallback mechanism for environments where Ajax is not supported.

After talking about the advantages of using Wicket, the speaker noted that it has a disadvantage compared to other agile frameworks: the Java verbosity. But don't worry, Scala is here to the rescue. He showed how to use Scala to build a DSL that made server-side component development considerably shorter (while keeping the advantage of the Scala compiler's type-safety checks).

Finally, he introduced JavaEE 6 to the combo in order to make it even simpler. He used the following specifications:
  • JSR 330 @Inject for IoC pattern implementation.
  • JSR 303 Bean Validation for (obviously) validation.

Finally, he talked about the Gamboa Project, which combines JavaEE 6, Wicket and Scala.

Adrián Biga asked why Wicket didn't gain more popularity. The speaker answered that it is because it doesn't have a company like Oracle, Google, etc. behind it. Very brave, considering he was presenting at an Oracle event!!!

As a curious thing, he used Sublime instead of Eclipse/NetBeans.

Code Design: The Quality that Makes the Difference - Speakers: Guilherme Silveira and Paulo Silveira
In this talk, the speaker started by differentiating pretty code (código bonito) from good code (código bom). He said that we should focus on building good code (that is, understandable, maintainable, etc.). He aimed to show some examples of code that could be considered pretty (I personally think that beauty is a subjective concept) but wouldn't be good.

He spoke about the typical (very waterfall-like) development sequence:
  1. Architecture
  2. Design (influenced by the architecture)
  3. Implementation (influenced by design)

but he also noted that some design concepts, such as IoC, influence the architecture. In the same way, sometimes the implementation influences the design. So, the order can be inverted:
  1. Implementation
  2. Design
  3. Architecture

He also said that when you change the architecture or the design, you just change documents; the application is only changed when the implementation changes. The conclusion here is that the only thing that exists is the implementation (architecture and design don't exist) and that the most important thing is the developers. I disagree with the first part of the reasoning: I think he was talking about architecture and design documentation, not the architecture/design themselves, which are clearly reflected in code, configuration files, nodes, etc.

He showed many examples of code that could be considered pretty but have problems such as:
  • Many branches, complex flow: he showed a bunch of nested ifs and loops.
  • Readability: he showed a one-line sentence that did a lot of things. Conciseness is using the minimum amount of words without losing clarity.
  • Broken encapsulation: he showed how some frameworks, such as Hibernate, force you to build classes with an empty constructor, which could force you to expose internal state through getters/setters.

He refactored all the examples in order to correct those issues. Sometimes, in the one line example, the "refactoring" was as simple as adding line breaks.

Finally, the speaker gave some recommendations to take into account when writing code:
  • Don't leave garbage in code (that is, remove useless code and old, outdated comments; refactor badly designed structures; etc.).
  • Do pair programming.
  • Do Brown Bag refactoring.
  • Perform periodic code review.

As a curious thing, all the examples were made using Ruby instead of Java.

Day two - December 7, 2011

What's New in LWUIT - Speaker: Roger Brinkley
LWUIT stands for Lightweight UI Toolkit for JavaME and it was inspired by Swing.

This toolkit is not intended just for tablets/phones; it aims to be more generic. It can run on CLDC devices such as Blu-ray players, etc. In order to achieve this, it is an efficient, low-overhead framework.

The speaker showed the demo that comes with LWUIT.

LWUIT is 100% open source. It includes:
  • Styles and themes.
  • Renderers.
  • 2D/3D Animations.
  • Widgets (buttons, checkbox, etc.)
  • Transitions from one screen to another. The application can query for the available transition (for example, 3D transitions wouldn't be available on devices without 3D support).
  • Multiple fonts, which can be included into the application.
  • Tabs
  • Layouts
  • Scrolling (like UIScrollView in iPhone, which creates an area that can be scrolled moving the finger over the screen).

I think that it is a catch-up with iPhone/Android, but there's still room for improvement. For example, the animations shown in the demo were made by programming a for loop, while the iPhone and Android platforms provide simpler approaches (animatable properties, interpolators, etc.). Also, the UI didn't look very nice to me.

After the demo, the speaker talked about the new features of LWUIT 1.5. Such features include:
  • A GUI builder built on top of NetBeans. He said that it doesn't require code generation, but I'm not sure what this means. Would it be two-way, just like Window Builder Pro?
  • Component tabs. They replace the LWUIT 1.0 tab panels and look a lot like iOS tabs.
  • Slider control, which can be customized using themes.
  • Spring inspired MVC+Renderer model.
  • JavaSE port, which eliminates the need for an emulator and enables using any existing IDE, debugger, profiler, etc. However, this is not a replacement for on-device testing.

Finally, the speaker talked a little about LWUIT4IO, which is an API that provides storage, network, JSON and XML parsing, etc. I don't know this framework in detail, but mixing UI and IO doesn't sound like a good idea. I would have called it LWIOT (Lightweight IO Toolkit) instead. The speaker mentioned the existence of a Facebook/ad integration component, which is built on top of LWUIT4IO.

Coding Dojo with Java 7 - Speaker: Otávio Gonçalves de Santana
I arrived a few minutes late to this session. The speaker was asking the audience some questions. He asked who used to do sports and who used to play music. After that, he asked how many hours a day they used to spend in training. He received many responses: 2 hours, 3 hours, etc. Finally, he asked: "and how much time do you spend training your programming skills?". No answer. The conclusion: programmers usually don't train.

After that, he mentioned some techniques that programmers should implement:

Also, he briefly explained some Coding Dojo meeting techniques:
  • Prepared Kata: 1 person solves a problem in front of an audience.
  • Randori Kata: 2 people solve a problem in front of an audience.
  • Retrospective: People talk about lessons learned along the project.

After explaining the basis for Coding Dojo, he talked about Java 7 new features:
  • Project Coin (JSR 334).
  • New file API (JSR 203).
  • Concurrency and collection improvements (jsr166y)
  • Dynamic method invocation (JSR 292). Regarding this point, the speaker showed some benchmarks. A test comparing reflection with invokedynamic showed a 34x speed improvement. Another test with other operations, including String concatenation, showed a 15x improvement.

Finally, the session ended with a practical exercise: two programmers from the audience solving a problem using TDD (Randori Kata?).

HotRockit: What to Expect from Oracle's Converged JVM - Speaker: Marcus Hirt
The speaker talked about changes and features that will be found on HotRockit:
  • The JRCMD command line tool will be replaced by JCMD. This tool allows monitoring a running application, by querying different information about the JVM.
  • The JMX Agent will be updated. Now it can be started inside an already running JVM.
  • JDP (Java Discoverable Protocol) allows detecting when managed JVMs are started or shut down.
  • JRockit Flight Recorder (profiler) will be renamed to Java Flight Recorder.  On this point, the speaker showed a very cool Eclipse plugin for using this profiler.
  • UI builder allows building custom dashboards and screens.
  • Memory Leak Server: an online heap analyzer.
  • PermGen will be removed. This memory space will be dynamic. The user will no longer need to tune it manually prior to running the JVM. This change will not have an impact on performance.
  • Deterministic Garbage Collector. The user can set a max pause time target and the JVM will try to meet it (using statistical information, I guess). Also, the compiler will do some optimizations on this aspect. Due to this best effort approach, it is called soft realtime GC.

JSR 343: What's Coming in Java Message Service 2.0 - Speaker: Adam Leftik (Arun Gupta was listed in the program)
This session was about the JMS spec update (JSR 343), which includes several improvements over the previous one (JMS 1.1, year 2002!!). However, it will be backwards compatible.

JMS 2.0 will include a simplified API, which will be based on CDI (JSR 299). This will simplify, for example, accessing resources, since they can be injected (RI - Resource Injection).

It will have multi-tenancy support, following the approach of JavaEE 7: to enable creating Platform as a Service (PaaS) solutions. For example, you will have the ability to define destinations using annotations.

In order to make it compatible with JDK 7 new try-with-resources block, the Connection class will implement the AutoCloseable interface.
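JMS itself isn't part of the JDK, so the stand-in class below just illustrates what this buys: once a resource implements AutoCloseable, try-with-resources guarantees close() is called. FakeConnection is hypothetical, purely for illustration:

```java
public class AutoCloseDemo {
    static final StringBuilder log = new StringBuilder();

    // Hypothetical stand-in for a JMS 2.0 Connection.
    static class FakeConnection implements AutoCloseable {
        void send(String msg) { log.append("sent:").append(msg).append(";"); }
        @Override public void close() { log.append("closed;"); }
    }

    public static void main(String[] args) {
        try (FakeConnection conn = new FakeConnection()) {
            conn.send("hello");
        } // close() runs here automatically, even if send() had thrown
        System.out.println(log); // sent:hello;closed;
    }
}
```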

The spec will include better Application Server integration facilities:
  • Standard API allowing any JMS provider to work inside any Application Server. There will be optional interfaces.
  • Providing a JCA connector will be mandatory in JMS 2.0.

Also, it will include new API features:
  • Delivery delay.
  • Asynchronous ACK.
  • Batch message delivery and reception.
  • Topic hierarchies. This way you can subscribe to a group of topics using wildcards.
  • Multiple consumers on the same topic subscription.
  • JMSXDeliveryCount will now be mandatory.

During the presentation, the speaker mentioned the forthcoming EJB 3.2 specification many times.

The speaker showed the JMS spec URL and asked the audience to participate in the community.

Embedded Java: Smart, Connected, Pervasive - Speaker: Terrence Barr
The speaker talked about JavaME promises for 2013. It will be integrated with JavaSE.

Mostly, the session was about Web integration.

An interesting thing shown in the talk was GSMA OneAPI GPS integration.

How to Refactor in JDK 7 - Speaker: Geertjan Wielenga
This session was about how to refactor code in order to take advantage of the improvements in Project Coin. This project includes small changes which don't have an impact on the JVM. They do have an impact on the compiler, but it keeps producing standard Java bytecode.

The speaker explained some new characteristics of Java 7 included in Project Coin:
  • Try-with-resources.
  • Multi-catch.
  • Strings in switch statements.
  • Diamond operator.

and showed how to migrate existing code in order to take advantage of such new features using NetBeans. He used a really cool NetBeans tool: Inspect and Transform. This tool analyzes code and automatically makes the requested changes where possible. It doesn't just handle Project Coin migration; it has a lot more options for improving code. The templates are similar to the Swing tools.

Will the day come when programmers are out of work? :)

I think that Sun/Oracle is putting a bigger effort in supporting migration than they did with the last language update (JavaSE 5). This is a good thing.

Also, the speaker showed a framework for building NetBeans apps. This framework provides menus, toolbars, etc. (just like Eclipse RCP, I believe). Also, he showed how NetBeans can package the application using different formats: Windows EXE, Mac OSX app, ZIP, etc.

As a curious thing, the speaker had technical problems with his notebook (the VGA connection didn't work), so Juan José Gil lent him his notebook, which was running an Ubuntu installation. Don't worry Juanjo, I'm not going to publish the video here :)

Interfacing the Interface: JavaFX 2.0, Wiimote, Kinect and More - Speaker: Simon Ritter
The speaker started the presentation saying (among other things) that the keyboard/mouse days are gone. Now is the rise of the gestural interface.

The agenda of the session was:

The speaker showed the evolution of man-machine interfaces, from keyboards (which have been used since the earliest typewriters) to mice and multi-touch. He also showed many of the interfaces used in games, from Atari joysticks to Nintendo gamepads, PlayStation controllers, dancing carpets, etc. He said that the Nintendo Wii brought gestures to gaming.

He talked about game programming tools based on Java. He said that JavaFX 2.0 is simpler than its predecessor and is not based on scripting: it is a Java API, more like Swing. Also, he mentioned some cool features such as binding, which allows updating the screen when something changes (for example, in order to show movement). Such improvements, alongside plugin support, will increase productivity. Another cool JavaFX 2.0 feature is Mac OS X support.

JavaFX is limited to 2D graphics, which is a problem if you want to do 3D gaming. It will have 3D support, but this work is still in progress. On the other hand, jMonkeyEngine provides good 3D support, but it is harder to program and has poor backward compatibility (I have had problems with API changes when playing with jMonkeyEngine, but I was using the 3.0 alpha version).

In the practical demonstration, he showed how to use JSR 82 (Java APIs for Bluetooth) to interface with the Wiimote controller.

Also, he explained how they built a glove that detects hand position. He presented electronic diagrams, which I obviously didn't understand :)

The coolest part was when he explained how Kinect works. He showed the different steps that Kinect uses in order to detect the user's body position, arm and leg movements, etc. An intelligent algorithm infers the body parts' locations. Regarding programming it in Java, he said that there is no direct support, but there is a Java wrapper for the OpenNI C++ driver.

He demonstrated all the stuff working together: Wiimote, Kinect, glove and VR glasses. The screen showed images tracking head movement, a blue silhouette detecting the human body, a skeleton reflecting user movements, a scenario built using jMonkeyEngine, etc.

When the session finished, I asked him: do you need somebody working for food on your project? He laughed, but I was being serious!!! :P

Day three - December 8, 2011

Servlet 3.1 update - Speaker: John Clingan
This guru started with a phrase: "JavaEE is not a product, it is a technology".

He presented improvements over Servlet 3.0 specification.
  • Simplified use. Servlets, filters, listeners have new annotations for configuring them, for example, @WebServlet. It will be simpler than most existing frameworks (Struts was mentioned, which may not be a good example).
  • Service metadata will also be redefined.
  • NIO API will allow:
    • Thread reusing.
    • Callback-oriented approach.
    • WebSocket (for Ajax programming)
  • Protection against CSRF and XSS Web attacks. OWASP support was not confirmed by the speaker.

The specification is expected for year 2012 (subject to change).

Develop, Deploy and Monitor a Java EE 6 Application with Clustered GlassFish 3.1 - Speaker: Arun Gupta
Besides the Java EE 6 topics, the speaker talked about new Java EE 7 features. It will be ready for cloud development and will enable Platform as a Service (PaaS) development.

Obviously, GlassFish container will support this platform.

Other areas where this container will have improvements include:

  • Load balancer.
  • Clustering.
  • Database initialization.

Ease of use will also be improved.

JavaFX Architecture and Programing Model - Speaker: Joe Andresen
This session was about JavaFX 2 main features. The speaker said that JavaFX provides a common API for applications to be run from:

So, you can write an application that runs on multiple environments. You can even call JavaScript functions (non-browser environments use a JavaScript engine, I guess). The speaker showed an example which I didn't like too much: a JavaFX application which extended the base application class and included the Java main method for running from the command line. It would be nice to have the application and environment-specific stuff like that clearly separated.

As I mentioned in a previous session, JavaFX 2.0 doesn't include scripting. I didn't know this, but I think it is a good decision. Why reinvent the wheel? There are so many language options in Java... Scala, Groovy, Rhino, Jython, just to name a few.

The presentation continued describing the main JavaFX parts.

Scene Graph
The scene graph is a Directed Acyclic Graph (DAG) of nodes. It contains the GUI representation, which must have a low-to-mid memory footprint.

Each node has a peer in the UI model. The scene graph and model updates are done in different threads, so they don't interact in a direct way. There is an automatic synchronization mechanism, which does periodic "pulses". This is the only point where both threads communicate with each other. JavaFX incorporates automatic property binding.

Also, the speaker showed a technique called "bucketizing", which consists of making properties lazy in order to avoid unnecessary memory usage. He did it manually, coding an if statement in the getter. It would be great if JavaFX could provide automatic support for this aspect.
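My reading of that trick, as a minimal sketch (the class and property names are made up, not from the talk):

```java
import java.util.ArrayList;
import java.util.List;

public class LazyNode {
    private List<String> styleClasses; // potentially heavy, often unused

    public List<String> getStyleClasses() {
        if (styleClasses == null) {        // the manual "if" in the getter
            styleClasses = new ArrayList<>();
        }
        return styleClasses;
    }

    // Query methods avoid forcing the allocation.
    public boolean hasStyleClasses() {
        return styleClasses != null && !styleClasses.isEmpty();
    }
}
```

Nodes that never touch the property never pay for the backing object.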

The speaker showed keyframe-based animations, which reminded me of how Flash animations are programmed. Also, he showed the many kinds of transitions between screens which are available in JavaFX 2.0. One of the merits of the iPhone is having made transitions fashionable :)

JavaFX provides a strict separation of model and view. As explained before, the UI and the model state are updated on different threads.

The speaker said that JavaFX has a platform-specific feel and a cross-platform look. I'm not sure what this means. Maybe the user interaction (events, keyboard use, etc.) is platform-specific and the UI style is the same on all platforms.

The controls can have styles (which is no surprise; the only mainstream, modern UI technology that doesn't incorporate styles is iOS).

The speaker mentioned an interesting feature about layouts: layout bounds can be different from the actual bounding volume. In my experience, when building iOS applications, you can build cool interfaces but you don't have layouts, so you must do many things by hand. On the Android platform, you have layouts, but usually the interfaces where the controls fit into boxes are not considered "cool". Maybe this feature adds a new approach.

JavaFX is able to manipulate DOM and HTML from Java code. Also, it can react to DOM and HTML events (from Java too). Also, it allows embedding Web content (for example, Google Maps).

The JavaFX web support is based on WebKit, which is widely supported. This way, there is no need for building in Java functionality already available in WebKit.

JavaFX uses native codecs and plays common media formats. The scheme is designed in order to minimize CPU load and battery use.

The media is synchronized with the scene graph using the standard JavaFX mechanism.

The Heads and Tails of Project Coin - Speaker: Dan Smith
Project Coin includes small changes to the Java language which don't modify the JVM.

The speaker showed many examples using NetBeans. He also enumerated the different considerations and approaches taken into account for each new feature.

Regarding Strings in switch statements, he showed how the code is translated into non-Java 7 code in order to create standard bytecode. A switch using the String hash code is built. Inside each case block, the equals method is used to determine whether the string is the expected one. I think that for scenarios with many case blocks this approach provides better performance than generating a pure if-equals-else chain.
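A simplified, hand-written version of that translation (the real compiler output uses two switches and an index variable, but the hashCode-then-equals idea is the same; the case constants below are the actual values of "yes".hashCode() and "no".hashCode()):

```java
public class StringSwitchDesugar {
    // Equivalent of: switch (s) { case "yes": return 1; case "no": return 0; }
    static int code(String s) {
        switch (s.hashCode()) {
            case 119527: // "yes".hashCode()
                if (s.equals("yes")) return 1; // equals() guards against hash collisions
                break;
            case 3521:   // "no".hashCode()
                if (s.equals("no")) return 0;
                break;
        }
        return -1; // default
    }

    public static void main(String[] args) {
        System.out.println(code("yes"));   // 1
        System.out.println(code("maybe")); // -1
    }
}
```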

He talked about @SafeVarargs and people asked a lot of questions about exceptions. Not surprisingly, methods with this annotation can't be overridden, since that would allow replacing safe code with unsafe code via inheritance.
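A short sketch of the annotation (my own example, not the speaker's):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class SafeVarargsDemo {
    // @SafeVarargs promises the method doesn't misuse its varargs array, which
    // silences the "unchecked generic array creation" warning at call sites.
    // It is only allowed on methods that can't be overridden (static or final
    // in Java 7), because a subclass could otherwise swap in an unsafe body.
    @SafeVarargs
    static <T> List<T> listOf(T... items) {
        return new ArrayList<>(Arrays.asList(items));
    }

    public static void main(String[] args) {
        // Without the annotation this call would warn: it silently creates a
        // List<String>[] array behind the scenes.
        List<List<String>> lists = listOf(Arrays.asList("a"), Arrays.asList("b"));
        System.out.println(lists.size()); // 2
    }
}
```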

The speaker talked about the try-with-resources block. He said that JDBC 4.1 will include support for AutoCloseable interface.

Regarding the diamond operator, different approaches for type inference were proposed (for example, Scala uses a different approach: the type is specified in the construction, not in the variable declaration). They ran an analysis over millions of existing lines of Java code. Both approaches proved to be equally good, so they opted for specifying the type in the variable declaration, which they considered to have better evolution and maintainability characteristics.

In order to show how small changes can produce big impacts, the speaker showed a cross-reference graphic built from the Java spec sections' dependencies. The graph grew considerably after adding the Project Coin changes.

Finally, the speaker commented some changes expected for JDK 8. He talked about some big ones (Lambda and Jigsaw) and the smaller ones (JSR 308 - Type Annotations). Small changes to the language will be considered in JDK 8 too.

FileNotFound: A Tour of the File System API in JDK 7 - Speaker: David Simms
The filesystem improvements in Java 7 build a comprehensive interface to the filesystem. As Adrián Biga (who was sitting right next to me) said, most of these functionalities were already available in C/C++. It's good to know that Java is catching up.

The DirectoryStream interface allows traversing directories in a sequential way. You can apply filters to it (in a similar way to Google Guava predicates).
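For example (my own snippet), listing only the .java files in a directory with a glob filter:

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class ListJavaFiles {
    // The second argument to newDirectoryStream is a glob filter applied
    // by the filesystem provider while the directory is iterated lazily.
    static List<String> javaFiles(Path dir) throws IOException {
        List<String> names = new ArrayList<>();
        try (DirectoryStream<Path> stream = Files.newDirectoryStream(dir, "*.java")) {
            for (Path entry : stream) {
                names.add(entry.getFileName().toString());
            }
        }
        return names;
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("demo");
        Files.createFile(dir.resolve("A.java"));
        Files.createFile(dir.resolve("notes.txt"));
        System.out.println(javaFiles(dir)); // [A.java]
    }
}
```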

Also, support for symbolic links was added. The speaker showed such support through several methods:

  • Files.isSymbolicLink: checks whether a path is a symbolic link.
  • Files.createSymbolicLink: creates a symbolic link.
  • Files.isSameFile: checks whether two paths locate the same file (e.g., whether one is a link to the other).

Also, there is support for file copying (Files.copy) and moving (Files.move). No more input-output-stream-while-loop-byte-array-buffer stuff. I know, there are a lot of frameworks which solve these issues, but everybody has written a file copy routine at some point.
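The one-liners look like this (my own snippet):

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class CopyMoveDemo {
    public static void main(String[] args) throws Exception {
        Path dir = Files.createTempDirectory("copydemo");
        Path src = dir.resolve("original.txt");
        Files.write(src, "hello".getBytes("UTF-8"));

        // One-line copy; REPLACE_EXISTING overwrites a previous target.
        Path copy = Files.copy(src, dir.resolve("copy.txt"),
                               StandardCopyOption.REPLACE_EXISTING);

        // Move (rename); options such as ATOMIC_MOVE can be requested too.
        Path moved = Files.move(copy, dir.resolve("renamed.txt"));

        System.out.println(new String(Files.readAllBytes(moved), "UTF-8")); // hello
    }
}
```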

File attributes can be accessed and modified too. This includes permissions, ownership and timestamps. There even are classes for accessing/writing them in a bulk mode.

The method Files.walkFileTree allows doing recursive operations over a directory tree. It uses the Visitor pattern.
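A minimal visitor (my own snippet): SimpleFileVisitor provides no-op defaults, so you only override the callbacks you care about.

```java
import java.io.IOException;
import java.nio.file.FileVisitResult;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.SimpleFileVisitor;
import java.nio.file.attribute.BasicFileAttributes;

public class CountFiles {
    // Counts regular files anywhere under root.
    static int count(Path root) throws IOException {
        final int[] files = {0};
        Files.walkFileTree(root, new SimpleFileVisitor<Path>() {
            @Override
            public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) {
                files[0]++;
                return FileVisitResult.CONTINUE;
            }
        });
        return files[0];
    }

    public static void main(String[] args) throws IOException {
        Path root = Files.createTempDirectory("walkdemo");
        Files.createFile(root.resolve("a.txt"));
        Files.createDirectory(root.resolve("sub"));
        Files.createFile(root.resolve("sub").resolve("b.txt"));
        System.out.println(count(root)); // 2
    }
}
```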

A WatchService interface provides notifications on file changes. But this is not limited to files, the watching service is more generic and allows notifications of many types. There is even support for native notification mechanisms.
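A basic usage sketch (my own snippet); the service is backed by the native notification mechanism (inotify, etc.) where one is available:

```java
import java.nio.file.FileSystems;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardWatchEventKinds;
import java.nio.file.WatchKey;
import java.nio.file.WatchService;
import java.util.concurrent.TimeUnit;

public class WatchDemo {
    public static void main(String[] args) throws Exception {
        Path dir = Files.createTempDirectory("watchdemo");
        WatchService watcher = FileSystems.getDefault().newWatchService();
        dir.register(watcher, StandardWatchEventKinds.ENTRY_CREATE);

        Files.createFile(dir.resolve("new.txt")); // trigger a creation event

        WatchKey key = watcher.poll(10, TimeUnit.SECONDS);
        if (key != null) {
            System.out.println("events: " + key.pollEvents().size());
            key.reset(); // re-arm the key to receive further events
        }
        watcher.close();
    }
}
```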

Finally, the file system implementation was abstracted into the FileSystem class. The FileSystemProvider acts as a factory for FileSystem instances. The JDK includes, as an example, a ZIP file system implementation.
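With the bundled ZIP provider (my own snippet), a ZIP archive can be opened or created as a FileSystem and then manipulated through the normal Files API:

```java
import java.net.URI;
import java.nio.file.FileSystem;
import java.nio.file.FileSystems;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.HashMap;
import java.util.Map;

public class ZipFsDemo {
    public static void main(String[] args) throws Exception {
        Path zip = Files.createTempDirectory("zipdemo").resolve("archive.zip");
        Map<String, String> env = new HashMap<>();
        env.put("create", "true"); // create the archive if it doesn't exist

        // The "jar:" scheme selects the ZIP filesystem provider.
        URI uri = URI.create("jar:" + zip.toUri());
        try (FileSystem zipfs = FileSystems.newFileSystem(uri, env)) {
            Files.write(zipfs.getPath("/hello.txt"), "hi".getBytes("UTF-8"));
        }
        System.out.println(Files.size(zip) > 0); // true
    }
}
```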

Project Jigsaw: Putting it Together - Speaker: Matherey Nunez
"The unit of reuse is the unit of release" - Robert "Uncle Bob" Martin

If you think about this phrase and Java, you will realize that a package is not the same as a JAR file. Something must be wrong.

I was late for this session, since I forgot that it was at the same time as a JavaEE Web Profile session. When I realized that, I moved from that session to the Jigsaw one. Anyway, I only missed the first slides.

Jigsaw will be released with JDK 8. It will provide many packaging formats, which can include digital signatures. This will include web publication concerns. Obeying one of its requirements, the Jigsaw implementation will have no impact on performance.

JavaSE will be converted to Jigsaw (it will be "sliced", but not in a fine-grained way). The speaker showed a dependency graph just to give an idea of how complex this task would be.


  • Grouping: Modules are grouped using a file. This file contains module declarations and it can have an entry point (a class for executing the module).
  • Dependency: The scary JAR hell can be avoided by specifying dependencies in the module. Dependencies include the version of the referenced modules. Such versions can be specified as a range.
  • Encapsulation: Jigsaw allows specifying which packages are visible to the outside.
  • Splitting / Aggregation: A module can be split into more modules. In a similar way, many modules can be aggregated into one using a feature called "module aliasing".
  • Module Files: They are compiled using the javac -modulepath command. The module (a file with .jmod extension) is built using the jpkg -modulepath command.
  • Library: A mechanism for grouping many modules. Library delegation is also supported. This way, modules can follow a hierarchical resolution approach.
  • Repositories: Jigsaw will include a repository mechanism, which will allow automatically downloading modules not available locally. It also allows recursive dependency resolution, just as Debian packages.
  • Native packages: They will be supported in Jigsaw.
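To give a feel for the grouping, dependency and entry-point ideas above, a module declaration in the Jigsaw prototype of the time looked roughly like this (module names are hypothetical; this is early prototype syntax and it changed before the module system finally shipped):

```
// module-info.java (early Jigsaw prototype syntax, circa 2011)
module com.example.app @ 1.0 {
    requires com.example.lib @ >= 2.0;  // versioned dependency, ranges allowed
    class com.example.app.Main;         // entry point for executing the module
}
```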

Old applications will remain compatible provided they don't depend on the JRE/JDK internal structure and use only the JavaSE API. For example, the rt.jar and tools.jar files will be removed.

A natural question would be "why not use OSGi?". It already provides a dynamic module system. The arguments given against OSGi were the lack of support for native packages and its package dependency approach.

I'm not sure about the second point. OSGi provides both dependency mechanisms: by package and by bundle (the module equivalent). So it is up to the user to choose which one to use. However, I have heard people say that specifying dependencies by package is the recommended practice. Jigsaw doesn't support this approach, though.

The argument against package dependency specification is that it can produce inconsistencies when the same package is exported by different modules. For example, one module exports packages A and B, and another one exports B. From which module should a client that requires both A and B import package B?

Finally, Jigsaw doesn't include anything comparable to OSGi dynamic services. I asked about it, and the answer was that modularity and dynamic services are orthogonal concepts: Jigsaw could be used with any other dynamic service framework. So we'll not migrate Nibiru to Jigsaw until we see how this could be resolved :)

Project Lambda: To Multicore and Beyond - Speaker: Dan Smith
The JSR 335 spec (Project Lambda) aims to bring closures to the Java language. This will help when programming for multicore environments.

The speaker showed some examples. An interesting detail is that local variables used inside closures must be final (just like with inner classes). Another detail is that, inside a closure, the "this" keyword references the containing instance, not the closure itself (unlike inner classes).
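A small sketch of both points (note that this uses the syntax that eventually shipped in Java 8; the 2011 prototype notation was slightly different, and the released spec relaxed "final" to "effectively final"):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class LambdaDemo {
    // Sort by length using a lambda in place of an anonymous Comparator.
    static List<String> sortedByLength(List<String> words) {
        List<String> out = new ArrayList<>(words);
        Collections.sort(out, (a, b) -> Integer.compare(a.length(), b.length()));
        return out;
    }

    public static void main(String[] args) {
        // Captured locals must be final (effectively final in the final spec).
        final String tag = "note";
        // The lambda has no 'this' of its own; it sees the enclosing scope.
        Runnable r = () -> System.out.println(tag + ": lambdas capture enclosing locals");
        r.run();

        System.out.println(sortedByLength(Arrays.asList("ccc", "a", "bb"))); // [a, bb, ccc]
    }
}
```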

There are some restrictions. For example, you can't rely on closure identity: the JVM is free to create as many closure instances as it needs for a given task. Maybe this restriction is intended to support parallel processing.

The speaker talked about the problems they faced when making the Java API evolve. Changing interfaces is a complex task if you have to maintain backwards compatibility. They solved the issue by adding "default" method implementations to interfaces (much like Scala traits). This will add multiple inheritance to Java, which was deliberately avoided when the language was created! But, as the speaker said, no state inheritance will be allowed (since only behavior can be specified in interfaces), so there shouldn't be problems with diamond inheritance scenarios. However, this will require new rules for runtime method resolution.
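A sketch of the idea as it eventually shipped in Java 8 (interface and class names are made up): behavior lives in the interface, a subinterface may override it, and the most specific implementation wins at runtime.

```java
public class DefaultMethodDemo {
    interface Greeter {
        String name();
        // Behavior (but no state) can live in the interface itself.
        default String greet() { return "Hello, " + name(); }
    }

    interface Shouter extends Greeter {
        // A subinterface may override the default; method resolution
        // picks the most specific implementation.
        @Override
        default String greet() { return "HELLO, " + name() + "!"; }
    }

    static class Person implements Shouter {
        public String name() { return "Ana"; }
    }

    public static void main(String[] args) {
        System.out.println(new Person().greet()); // HELLO, Ana!
    }
}
```

This is how existing interfaces such as those in the collections framework could grow new methods without breaking old implementations.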

The collection classes and interfaces will be modified in order to support closures while (as said before) maintaining backward compatibility. The Iterable interface will be modified in order to support lambdas. Depending on the case, eager or lazy strategies will be adopted.

Also, the collection framework will provide support for parallelism. Collection processing could already be parallelized using the fork-join framework found in JDK 7, but this could be too low level. Instead, collections will be able to parallelize themselves. In a similar way, the Spliterable interface will be modified: collection implementations that implement this interface provide a natural way of splitting themselves.
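For context, this line of work eventually shipped in Java 8 as the streams API, where the sequential and parallel versions share the same surface and the parallel one splits the source across fork-join workers behind the scenes:

```java
import java.util.Arrays;
import java.util.List;

public class ParallelSum {
    public static void main(String[] args) {
        List<Integer> nums = Arrays.asList(1, 2, 3, 4, 5);

        // Same pipeline, two execution strategies.
        int seq = nums.stream().mapToInt(Integer::intValue).sum();
        int par = nums.parallelStream().mapToInt(Integer::intValue).sum();

        System.out.println(seq + " " + par); // 15 15
    }
}
```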

Wow, you reached the end of the post... you deserve an award!!! Too bad that Oxen doesn't have a budget for that :P