Crazy IntelliJ with Play Integration

I have always been a fan of IntelliJ, until it threw a bomb at me when working with the Play framework.

This post serves the purpose of solving (or at least listing) some of the problems I encountered.

To reproduce:

  1. In the Play console [p>]: create a Java app with `play new [a project name]`
  2. Generate IntelliJ project files from the Play console with `idea with-sources=yes`, then import the project from IntelliJ
  3. Follow the "todo" Play tutorial

Problem List:

    • "Cannot resolve symbol" for reverse routing, as in return redirect(routes.Application.tasks());. It is related to the fact that part of the target folder, i.e. target/scala-2.10/src_managed (Scala code generated by play compile), is not included as a source root. So run "play compile" before importing into IntelliJ, and IntelliJ will notify you that:

You have useless source roots which may corrupt resolve for play 2 framework in your project. Don't delete them, but ignore the warning.

Execute the following steps to update your dependencies [solution provided on GitHub]:

    1. Run the update task from your play console
    2. Remove the .idea_modules and .idea/libraries directories
    3. Run the idea with-sources=yes command in the play console
  • "Cannot find Tools | Play with Playframework": with the Play 2.0 Support plugin version 0.36.431, the Play console is not shown under Tools; after installing the Play 2 plugin it is accessible from the context menu of Play-based modules (e.g. the controller Application) via Run Play 2 App.
  • When running a test from IntelliJ, an error that the underlying test class is not found: check the test-classes folder; it is possible that the test class was never compiled before running, so do test:compile without actually running the test. It is a bug reported here. So for now, do play> test and then run the test. Unsolved. Related background on the compiler process.
  • "Cannot resolve symbol" for reverse routing in index.scala.html, as in @form(routes.Application.newTask()): simply add classes_managed to the classpath.
  • To run a test that needs DB access: wrap it in a fakeApplication() and a Runnable (see the sketch after this list). Refer here.
  • Type mismatch at @inputText(taskForm("label")): Unsolved.
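For the DB-access bullet above, here is a minimal sketch using Play 2's Java test helpers (play.test.Helpers). Task is assumed to be the model from the todo tutorial (package models), and the assertion is purely illustrative:

import models.Task; // the Task model from the todo tutorial (assumed package)

import org.junit.Test;

import static org.junit.Assert.assertEquals;
import static play.test.Helpers.fakeApplication;
import static play.test.Helpers.inMemoryDatabase;
import static play.test.Helpers.running;

public class TaskModelTest {

    @Test
    public void countIsZeroOnEmptyDatabase() {
        // Wrap the DB access in a fake application backed by an in-memory database.
        running(fakeApplication(inMemoryDatabase()), new Runnable() {
            @Override
            public void run() {
                // Task.all() is the finder method defined in the tutorial's Task model.
                assertEquals(0, Task.all().size());
            }
        });
    }
}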

Working on Play 2.3.0 with IntelliJ

Install plugins: Scala, Play 2.0 Support, SBT (optional).

activator> compile, then test:compile, then idea with-sources=yes

Open build.sbt; IntelliJ asks to associate this build.sbt with the project, so do the import.

Change the IDE language levels if necessary.

Check the IDE preferences for Play, SBT and Scala.

Run a JUnit method test from the IDE; debugging with the Play 2 plugin works better than activator -jvm-debug 9999 run.

The IDE will automatically create root and root-build in the project modules.

NOTE: Scala 2.10.4 is used instead of the scalaVersion specified in build.sbt (unsolved).

Install https://github.com/jrudolph/sbt-dependency-graph following the two steps in its README, and run its tasks from within the Play console.

Warning: "Class path contains multiple SLF4J bindings": slf4j-log4j12 conflicts with logback. Use sbt-dependency-graph to identify which dependency pulls in log4j.


Understanding Multithreading, Jersey and Jetty

I think this comes down to a basic understanding of the relationship between requests, threads and the web container. Although similar concepts could apply to other web containers or REST frameworks, I prefer to be modest about my understanding, so this post is limited to a discussion of Jersey and Jetty.

There is a common question about how requests interact with web servers through REST. I want to know how each client request is handled by a thread spawned by the web container, and which configurations I can play with to tune my REST service's performance. What I did was simply:

  1. Create a Maven web app project and use jetty-maven-plugin for the test. It is relatively easy: I just issue mvn jetty:run and then I can test the service without any hassle.
  2. Write a simple service listening on a defined @Path, and write a web.xml to define my servlet. In the service, I wrote a method getDoc() that retrieves a local file and returns a Response with it. I want to know what happens when multiple client requests hit my REST service, so I purposely delay the service execution by adding Thread.sleep(a processing time); a sketch of such a resource follows this list.
  3. Use curl to simulate a naive client by calling curl --connect-timeout X --max-time Y -G -s -w %{time_total}:%{time_connect}\\n http://localhost:8080/rest/get. The GET request will wait X seconds to get the connection and Y seconds for the entire operation, otherwise a timeout will interrupt the client request. I tried this with 3 such clients simultaneously.
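Here is a minimal sketch of the kind of resource described in step 2, using standard JAX-RS/Jersey annotations; the path, the file location and the sleep duration are illustrative assumptions rather than the exact code from my experiment:

import java.io.File;

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

@Path("/get")
public class DocResource {

    @GET
    public Response getDoc() throws InterruptedException {
        // Artificially slow the request down so that the scheduling of
        // concurrent clients by Jetty/Jersey becomes observable.
        Thread.sleep(5000);

        // Return a local file as the response entity (path is illustrative).
        File doc = new File("/tmp/doc.txt");
        return Response.ok(doc, MediaType.TEXT_PLAIN).build();
    }
}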

The first finding is that Jetty accepts all 3 clients and processes them concurrently (i.e. the total time is less than 3 clients x processing time), with the requests interleaved with each other. Each client is blocked for slightly more than the processing time. That means the default Jersey service is a synchronous web service. If the client would be held too long and is better off carrying out other tasks, an asynchronous service is better; Jersey supports this.
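As a hedged illustration of the asynchronous style, here is a sketch using the JAX-RS 2 @Suspended/AsyncResponse API (available in Jersey 2.x; the Jersey version in this experiment may be older, and the pool size and sleep time are assumptions):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.container.AsyncResponse;
import javax.ws.rs.container.Suspended;

@Path("/async-get")
public class AsyncDocResource {

    // Worker pool that finishes the request off the container's request thread.
    private static final ExecutorService POOL = Executors.newFixedThreadPool(4);

    @GET
    public void getDoc(@Suspended final AsyncResponse asyncResponse) {
        POOL.submit(new Runnable() {
            @Override
            public void run() {
                try {
                    Thread.sleep(5000); // simulate slow processing
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
                // Resume the suspended request once the work is done.
                asyncResponse.resume("done");
            }
        });
    }
}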

So how do we play with connections, and what is the default behavior of the service? We need to look into the configuration of Jetty. The Jetty configuration (a jetty.xml along the lines below) can be referenced from the jetty-maven-plugin configuration in pom.xml.

<Configure id="Server" class="org.mortbay.jetty.Server">
  <!-- required configuration -->
  <!-- connectors -->
  <!-- handlers -->
  <!-- webapps/contexts -->
  <!-- optional configuration -->
  <!-- threadpool -->
  <!-- session id manager -->
  <!-- authentication realms -->
  <!-- request logs -->
  <!-- extra server options -->
</Configure>

I played with the thread pool to limit the max and min pool size, but the number of connections does not seem to be controlled by that: I limited both the min and max pool size to 1, and 3 clients were still processed simultaneously. The next step is to learn more about HTTP connections and Jetty configurations.

Interesting list:  Quora, Newbie Guide to Jetty, Jetty Optimization Guide

Jersey Client Discovery

Jersey Client has two different implementations for synchronous and asynchronous resources.

    • For a synchronous resource, the key invocation lies in createDefaultClientHandler(), which returns a new instance of URLConnectionClientHandler. The returned handler has a method handle(ClientRequest ro) which essentially opens a connection via java.net.URL. This means that each time a request is made, as in webResource.get(), a new URL connection is opened. So for synchronous resources, the number of threads holding an HTTP connection equals the number of requests.
    • For an asynchronous resource, the key invocation lies in AsyncWebResource#handle, which returns a FutureClientResponseListener. The constructor of AsyncWebResource calls getExecutorService(), which follows a singleton pattern and retrieves the single executor pool instance via either newFixedThreadPool or newCachedThreadPool. Before returning, AsyncWebResource#handle calls executorService.submit() and submits a runnable to the executor pool. So for asynchronous resources, the number of threads holding an HTTP connection is determined by the pool.

This is what happens when concurrent threads invoke the Jersey Client: either one HTTP connection per request, or connection threads controlled by the executor service.
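A minimal sketch of the two call styles against the Jersey 1.x client API discussed above; the URL is an assumption:

import java.util.concurrent.Future;

import com.sun.jersey.api.client.AsyncWebResource;
import com.sun.jersey.api.client.Client;
import com.sun.jersey.api.client.ClientResponse;
import com.sun.jersey.api.client.WebResource;

public class JerseyClientDemo {

    public static void main(String[] args) throws Exception {
        Client client = Client.create();

        // Synchronous: get() opens its own URL connection and blocks the
        // calling thread until the response arrives.
        WebResource sync = client.resource("http://localhost:8080/rest/get");
        ClientResponse response = sync.get(ClientResponse.class);
        System.out.println("sync status: " + response.getStatus());

        // Asynchronous: the request is handed to the client's executor
        // service and a Future is returned immediately.
        AsyncWebResource async = client.asyncResource("http://localhost:8080/rest/get");
        Future<ClientResponse> future = async.get(ClientResponse.class);
        System.out.println("async status: " + future.get().getStatus());

        client.destroy();
    }
}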

Studying Builder Pattern with Inheritance

Just to pay off some tech debt.

The Builder pattern alone is easy to understand, but not so much when inheritance is involved.

(Part I) understand the problem

First, we need to understand how nested builder classes work.

Given A <- B (B is a subclass of A), where A.Builder is a nested class of A and B.Builder is a nested class of B, and B.Builder extends A.Builder as well, what are the consequences of the following calls?

  1. new B.Builder(): neither A nor B is created, but A.Builder's constructor is called and then B.Builder's constructor is called. new B.Builder().getClass() returns B$Builder;
  2. new B.Builder().Bmethod(): calls B.Builder's Bmethod (which returns B$Builder); getClass() returns B$Builder;
  3. new B.Builder().Bmethod().Amethod(): calls A.Builder's Amethod (whose declared return type is A$Builder); but getClass() still returns B$Builder;
  4. new B.Builder().Bmethod().Amethod().build(): at runtime this calls B.Builder's build() (which returns new B(B$Builder)), because Amethod returns "this" and the compiler has no way of knowing the runtime type of "this". Once B(B$Builder) is called, B's super(B$Builder) runs first, and then the rest of B's constructor, so finally a B is created and returned. However, this only happens at runtime; at compile time, the build() that is resolved is A.Builder's build(), so a cast is needed for the code to compile.

Compile time follows the declared signatures strictly, but at runtime the JVM dispatches to the most specific override. getClass() returns the runtime class, and a downcast (from parent to child) only succeeds if the parent-typed reference actually points to a child object at runtime. The compiler does not know what "this" refers to, so an explicit cast must be written.
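Here is a minimal sketch of the classes behind the four calls above; the names follow the discussion and the bodies are placeholders, not code from any particular library:

class A {
    static class Builder {
        Builder Amethod() { return this; }  // static type of "this" here is A.Builder
        A build() { return new A(this); }
    }
    A(Builder builder) { }
}

class B extends A {
    static class Builder extends A.Builder {
        Builder Bmethod() { return this; }  // static type of "this" here is B.Builder
        @Override
        B build() { return new B(this); }   // covariant return type
    }
    B(Builder builder) { super(builder); }
}

// Call 4: the chain compiles, but after Amethod() the static type is A.Builder,
// so build() is resolved as A.Builder's build() and a cast to B is required.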

Therefore, as Eamonn McManus mentioned in the post,

new B.Builder().Bmethod().Amethod().build() will compile, but the required call order (Bmethod before Amethod) is not pleasant, and a downcast must be there in order to do something like

B b = (B) new B.Builder().Bmethod().Amethod().build();

(Part II) explore how to solve the problem

In summary, the problem arises because the parent class A$Builder has a method Amethod that returns this; at compile time the compiler correctly treats "this" as A$Builder, but what we want is for it to be B$Builder. So we need a trick to convince the compiler. Here we go:
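The commonly cited fix is a self-typed (recursive) generic builder; what follows is a hedged sketch of that technique, not necessarily the exact code this discussion originally continued with:

class A {
    static class Builder<T extends Builder<T>> {
        @SuppressWarnings("unchecked")
        protected T self() { return (T) this; }  // the single unchecked cast, hidden here
        public T Amethod() { return self(); }    // now returns the subclass builder type
        public A build() { return new A(this); }
    }
    A(Builder<?> builder) { }
}

class B extends A {
    static class Builder extends A.Builder<Builder> {
        public Builder Bmethod() { return this; }
        @Override
        public B build() { return new B(this); }
    }
    B(Builder builder) { super(builder); }
}

// With this in place the chain compiles in either order and needs no cast:
// B b = new B.Builder().Amethod().Bmethod().build();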

High-Frequency Shortcuts

Tmux

tmux a -t session.name -> attach to an existing session

tmux list-keys -t vi-copy

tmux list-buffers
tmux show-buffer -b n
tmux save-buffer -b n foo.txt

Vim Shortcuts

dG -> delete to the end of the file; combine gg then dG to delete all lines
dgg -> delete from the current line back to the beginning of the file.

IntelliJ Common Shortcuts

Ctrl + j -> bring up live templates

Ctrl + y -> delete current line

Ctrl + Alt + i -> indent current line

Ctrl + Alt + l -> auto format code

Ctrl + b -> go to declarations

Ctrl + Alt + Left/Right -> navigate files

Ctrl + Shift + F7 -> highlight usage

Alt + F7 -> find usage

/** +  Enter above a method signature -> create Javadoc stubs

Bash Shortcuts

Deleting: because I type a lot of rubbish…

Ctrl + w -> delete previous word

Alt + d -> delete next word

Ctrl + k -> delete from cursor to the end of line

Ctrl + u -> delete from cursor to the beginning of the line

export JAVA_HOME=`/usr/libexec/java_home -v 1.7`  -> switch the Java version on a Mac

Install WordPress the Hard Way

Install LAMP (MySQL, Apache, PHP)

Refresh the Linux package repository:
apt-get update

Install Apache2:
apt-get install apache2
Test Apache2 by going to localhost:80.

Install PHP with Apache2 support:
apt-get install php5 libapache2-mod-php5
Test PHP by writing an info.php in the /var/www/ folder.
Restarting Apache2 is needed.

Install MySQL:
apt-get install mysql-server mysql-client libapache2-mod-auth-mysql php5-mysql
Restarting Apache2 is needed so that the installed MySQL module is found.

Install WordPress

Using a virtual host:
Locate the default file in /etc/apache2/sites-available.
Copy it to a new site file named "wordpress".
Change the document root to the unzipped WordPress directory.
Adjust the rest accordingly.
Disable the default site and enable the newly created one:
a2dissite default && a2ensite wordpress


[Day 2] Get to Know Assembly


I was curious about the relationship between assembly (written for NASM) and the generated binary.

The findings are summarized as follows. Note: these are rough guidelines only; they do not seem exact, e.g. one parameter already takes at least 2 bytes.

  • ORG generates no binary code. Like labels, it is only considered at compile time.
  • One label used as an operand takes 2 bytes.
  • JMP takes 1 byte, 0xeb
  • DB, DW, DD take no bytes themselves, but they reserve space of 1, 2 and 4 bytes respectively
  • MOV itself takes no bytes, but registers such as AX and pointers such as SP take 1 byte each
  • MOV AX,0; the parameter 0 takes 2 bytes
  • One register (e.g. AX) can hold 32 bits (on a 32-bit CPU) = 4 bytes. There are 8 registers (AX, BX, CX, DX, BP, SP, SI, DI), so 4 bytes x 8 = 32 bytes in total, which is very little storage
  • [SI], a memory operand, takes 1 byte

“30 Days to Make Your Own OS” on Ubuntu

So far, this is the only English record of the steps taken while following the Japanese book.

This post serves to record and discuss my own self-learning progress. The code is synchronized with my GitHub repository.

Day 1

———————-

Install Image on QEMU on EC2 Ubuntu

Testing images on QEMU is a common task. This section talks about how to install images on QEMU in a Linux environment.

QEMU is a virtualization emulator running on a host OS (Ubuntu in my case). QEMU has a universal distribution for Ubuntu. Typing qemu- and tab-completing will list a batch of QEMU-related binaries; however, the qemu command itself is not among them.

P.S. I tried to install from the official QEMU git repository; it ended up with configuration failures on my EC2 Ubuntu.

Turning to the default QEMU shipped with Ubuntu, use qemu-system-x86_64 in place of qemu.

qemu-system-x86_64 -fda workstation/OSASK-Linux/src/helloos.img

returns the error:

Could not access KVM kernel module: No such file or directory
failed to initialize KVM: No such file or directory
Back to tcg accelerator.
Failed to allocate 402653184 B: Cannot allocate memory
Aborted (core dumped)

There are two errors: no KVM available, and not enough memory.


Get to Know Java Best Practices

Although it should probably be learnt the other way around, I find it interesting to get to know some bits of enterprise best practice before I can fully appreciate them.
It is true that I can only know the real benefit after I pay the price, but that won't stop me from learning these rules of thumb beforehand.

Reference:

Maven Configuration

  • Create a project pom.xml with <packaging>pom</packaging>, which defines all the project dependencies for the modules specified in <modules/>
  • Specify all version numbers in the parent pom's properties
  • Specify <dependencyManagement>, and inside <build> specify <pluginManagement>, if sub-modules exist. These sections let child poms inherit parent pom configurations. Outside the <*Management> sections, <dependencies> or <plugins> sections must still exist for Maven to download the actual jars.
  • Fix the Java compiler version to 1.6 with maven-compiler-plugin
  • Enforce a Maven version between 2.2.1 and 3.1.1 with maven-enforcer-plugin
  • Replace commons-logging and log4j with SLF4J, and don't use the JDK logging utility in the code either. Q: Why is that? The reason is here.
  • Ensure no SLF4J 1.5 or 1.6. Q: Why? Because they do not work together?
  • Don't allow Spring Framework 2.x or 3.0. Q: What is the difference from 3.1?
  • Add SLF4J 1.7 for logging, prefer logback over log4j, and enable root exception logging. Q: Why prefer it? The reason is here.
  • Ensure consistent source encoding across development environments with <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>, as well as <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding> for reporting plugins like surefire and failsafe
  • Solve Maven dependency conflicts by explicitly defining versions in <dependencyManagement>, after analyzing the conflicts using IntelliJ's dependency diagrams (conflicts are marked in red)
  • Make the developer environment load from a ./config directory instead of the module's resource folder; the resource folder is meant for more static configuration such as logger configs. Since developer-environment settings (test host URL, password, temporary data folder) are more likely to differ between individual developers, these configs should be put inside the config directory and have it ignored by git. Static resources are then loaded via getClass().getClassLoader().getResourceAsStream(), while for the config directory, make Maven look for a project.home property (via -Dproject.home=…) and put ${project.home}/config onto the classpath before calling getResourceAsStream(); the additional classpath entry can be added via surefire's additional classpath element. See the sketch after this list.
  • Pay attention to transitive conflicts if at runtime you encounter an error like "no such method". There is a high chance that one library pulls in a common transitive dependency with a different version. Find the culprit by looking at:
    mvn dependency:tree -Dverbose // then search (Google) which jar has classes related to the "no such method" error
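A minimal sketch of the classpath-based loading described above; the class name and file handling are assumptions, and it targets Java 6 to match the compiler setting in this list:

import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class ConfigLoader {

    /**
     * Loads a properties file from the classpath. Static configuration lives in
     * the module's resource folder; developer-specific configuration lives in
     * ${project.home}/config, which is added to the classpath (for tests, via
     * surefire's additional classpath configuration), so the same lookup finds both.
     */
    public static Properties load(String name) throws IOException {
        Properties props = new Properties();
        InputStream in = ConfigLoader.class.getClassLoader().getResourceAsStream(name);
        if (in == null) {
            throw new IOException("Configuration not found on classpath: " + name);
        }
        try {
            props.load(in);
        } finally {
            in.close();
        }
        return props;
    }
}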

Test-Driven Design

  • Class naming: ClassNameUnderTest + Tests, e.g. EventDaoTests (see the sketch after this list)
  • Method naming: UnitOfWork_StateUnderTest_ExpectedBehavior; reference
  • Naming practices for JUnit
  • Use JUnit for unit tests; examples (TestNG-style dependencies across classes are too messy for unit tests)
  • @BeforeClass/@AfterClass static methods for creating/releasing expensive objects, like DB connection objects
  • @Before/@After for creating/releasing common test objects, usually private fields of the test class, to ensure no dependency between test methods (each test method runs on a separate test class instance; releasing by assigning null so the garbage collector can collect the objects is probably not needed)
  • Declare a private static final SLF4J Logger at the beginning of the test class
  • Separate integration tests from unit tests. See the referenced organization of integration tests in the original blog and the useful practice. I prefer the use of systest, and in Maven I need to configure surefire (rather than failsafe) with profiles (the profile flag -P must be enabled when running mvn verify; details can be found in the Codehaus comments)
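A minimal sketch tying together the naming, lifecycle and logger conventions above; the class under test and the shared resource are stand-ins, not real project code:

import java.util.ArrayList;
import java.util.List;

import org.junit.After;
import org.junit.AfterClass;
import org.junit.Before;
import org.junit.BeforeClass;
import org.junit.Test;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import static org.junit.Assert.assertTrue;

public class EventDaoTests {

    // SLF4J logger declared at the top of the test class.
    private static final Logger LOG = LoggerFactory.getLogger(EventDaoTests.class);

    // Stand-in for an expensive shared resource (e.g. a DB connection),
    // created once per test class.
    private static List<String> sharedResource;

    // Cheap per-test fixture; each test method runs on a fresh test instance.
    private List<String> events;

    @BeforeClass
    public static void createExpensiveResources() {
        LOG.info("opening shared resource");
        sharedResource = new ArrayList<String>();
    }

    @AfterClass
    public static void releaseExpensiveResources() {
        LOG.info("releasing shared resource");
        sharedResource = null;
    }

    @Before
    public void setUp() {
        events = new ArrayList<String>();
    }

    @After
    public void tearDown() {
        events = null; // probably unnecessary, as noted above
    }

    // UnitOfWork_StateUnderTest_ExpectedBehavior
    @Test
    public void findAll_emptyDatabase_returnsEmptyList() {
        assertTrue(events.isEmpty());
    }
}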

Design Patterns and Packaging Concepts

Code Reference

      • Integration Test Profiles
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-plugin</artifactId>
      <configuration>
        <includes>
          <include>**/*Tests.java</include>
        </includes>
        <excludes>
          <exclude>**/systest/**</exclude>
        </excludes>
      </configuration>
    </plugin>
  </plugins>
</build>

<profiles>
  <profile>
    <id>itest</id>
    <activation>
      <property>
        <name>itest</name>
      </property>
    </activation>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-surefire-plugin</artifactId>
          <executions>
            <execution>
              <id>surefire-it</id>
              <phase>integration-test</phase>
              <goals>
                <goal>test</goal>
              </goals>
              <configuration>
                <excludes>
                  <exclude>none</exclude>
                </excludes>
                <includes>
                  <include>**/systest/**</include>
                </includes>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>

Importing Data into Neo4j

Importing data into Neo4j happens under one of two conditions: importing into a fresh new instance of the Neo4j server, or importing into an existing server.
Importing into an existing server requires more effort and should be the technically challenging case.
In this post, we only focus on the former.

Maxdemarzi's blog gives us detailed steps for importing data into a fresh instance, with the tools provided by Michael Hunger.

It should work as smoothly as you could expect.
I am just going to log what I found troublesome.



Installation of Phabricator

This post records my procedure for installing Phabricator (P for short).

mkdir /usr/local/phabricator

install_ubuntu.sh

Up to this point, P is installed, but the configuration is the more important part.

First, we need to set up web access to P.

cd /etc/apache2

Create a P site:

cd sites-available/

cp default phabricator

Write the P site file as:

<VirtualHost *:80>
  ServerAdmin webmaster@localhost
  ServerName MY-PC-ADDRESS-ON-THE-NETWORK
  DocumentRoot /usr/local/phabricator/phabricator/webroot

  RewriteEngine on
  RewriteRule ^/rsrc/(.*) - [L,QSA]
  RewriteRule ^/favicon.ico - [L,QSA]
  RewriteRule ^(.*)$ /index.php?__path__=$1 [B,L,QSA]

  <Directory "/usr/local/phabricator/phabricator/webroot">
    Order allow,deny
    Allow from all
  </Directory>

  ErrorLog ${APACHE_LOG_DIR}/error.log
  # Possible values include: debug, info, notice, warn, error, crit,
  # alert, emerg.
  LogLevel warn
  CustomLog ${APACHE_LOG_DIR}/access.log combined
</VirtualHost>

