Hosting A Static Website In Google App Engine - Eclipse, Java

June 29, 2015

A few months ago, I played around with Google App Engine. Somebody asked me to set up a personal website that is nothing but a collection of static pages, images, and a JS carousel/gallery widget, so I tried my hand at App Engine. In another instance, I hosted Google Course Builder on App Engine. In both instances my experience was smooth: I used the Google App Engine Launcher and configured the .yaml file (replace the application name with the unique application name you set up on Google).

What if you want to set up your site using Java on App Engine? Generally, Java developers start with Eclipse and follow the tutorial by Google.

The question is how to put our static files on App Engine and point to index.html as the startup page. Answer: include the static file location in appengine-web.xml.
In this case I created a "static" folder under the war folder. Also, set the expiration; with this setting, the server will ask the browser to invalidate the cache every 30 days and 5 hours.

        <include path="/static/**" expiration="30d 5h" />

The next step is to set this folder as the public root; index.html, which is configured in web.xml, will be picked up from this folder.
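To make this concrete, here is a minimal sketch of the two descriptors involved. The application id and version are placeholders, and the welcome-file path assumes index.html sits directly under the static folder; adjust both to your project.

```xml
<!-- war/WEB-INF/appengine-web.xml - application id is a placeholder -->
<appengine-web-app xmlns="http://appengine.google.com/ns/1.0">
  <application>your-unique-app-id</application>
  <version>1</version>
  <static-files>
    <!-- Serve everything under war/static as static content,
         cached by browsers for 30 days and 5 hours -->
    <include path="/static/**" expiration="30d 5h" />
  </static-files>
</appengine-web-app>

<!-- war/WEB-INF/web.xml - point the welcome file at the static page -->
<welcome-file-list>
  <welcome-file>static/index.html</welcome-file>
</welcome-file-list>
```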



The final output can be found at 



Generate token signing .CER from ADFS Federation Metadata XML

June 24, 2015

While working on SSO, our ADFS team provided me with the federation metadata XML only. As per this link, you also need a token-signing certificate from the provider to complete the setup and provide the XML file to ADFS.

Now, the question is how to generate .pem/.cer file out of FederationMetadata.xml file. 

  1. Edit the FederationMetadata.xml file and search for <KeyDescriptor use="signing">. You should find more than one entry; pick any one of them.
  2. Copy the X509Certificate value and save the text to a file, e.g. FILE_FROM_STEP2.der (the text is base64-encoded DER).
  3. Decode it and convert it to PEM:

        openssl x509 -inform DER -in <(base64 --decode FILE_FROM_STEP2.der) -out OUTPUT.pem

Use OUTPUT.pem as the Identity Provider Certificate.
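Alternatively, since the X509Certificate value is exactly the base64 body of a PEM certificate, you can skip the DER round-trip and just wrap it in PEM headers. A sketch, assuming openssl is available; the first two commands only fabricate a signing.b64 file from a throwaway self-signed cert so the example is self-contained — in practice signing.b64 holds the text you copied in step 2.

```shell
# Demo fixture only: fabricate signing.b64 from a throwaway self-signed
# cert. In practice this file holds the copied <X509Certificate> value.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 -subj "/CN=demo" \
        -keyout demo-key.pem -out demo-cert.pem 2>/dev/null
grep -v 'CERTIFICATE' demo-cert.pem | tr -d '\n' > signing.b64

# The value is base64-encoded DER, which is exactly the body of a PEM
# certificate, so wrapping it in PEM headers is enough:
{ echo "-----BEGIN CERTIFICATE-----"
  fold -w 64 signing.b64
  echo
  echo "-----END CERTIFICATE-----"
} > idp-cert.pem

# Sanity check: the wrapped file parses as an X.509 certificate
openssl x509 -in idp-cert.pem -noout -subject
```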



Get Complete Package.xml

June 24, 2015

Below are a few interesting posts about preparing package.xml automatically.

There is no way to figure out wildcard metadata components (*) using the API. Also, there is no way to figure out which components are not supported by the migration tool.
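For reference, a wildcard entry in package.xml looks like the sketch below (the metadata type and API version are just illustrative); it is these * members that the API will not expand into individual component names for you.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <!-- * retrieves every ApexClass component; the Metadata API
             does not enumerate the names behind this wildcard -->
        <members>*</members>
        <name>ApexClass</name>
    </types>
    <version>34.0</version>
</Package>
```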

Out of the above posts, I found Jitendra's post very interesting and useful, whereas the packagebuilder by the Tquila guys generates an exhaustive package.xml file for your organization.

On a side note, the Tquila guys have built a few interesting Heroku apps - one to switch components between orgs, and one to compare metadata between orgs.

Watch this space for more info!


Performance tuning of a visualforce page

June 22, 2015

It's my strong belief that a developer should always keep performance at the top of the priority list. As I was working on a big Visualforce page with a lot of aggregations, I wondered how to get the best performance out of Apex and VF.

Both were really helpful, until I ran into the following statements.

Apex Best Practice # 4 states "Use the power of the SOQL where clause to query all data needed in a single query." 

Visualforce guide states

"When writing Apex or SOQL for use with a Visualforce page: 

• Perform calculations in SOQL instead of in Apex, whenever possible.
• Never perform DML inside a loop. 
• Filter in SOQL first, then in Apex, and finally in Visualforce"

Now, I was confused between writing a single SOQL query and doing the aggregations (max, min, sum, avg) in Apex, or writing multiple aggregate SOQL queries and relaying the data to VF through Apex.
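The two candidate approaches can be sketched in Apex as follows; the object and fields (Opportunity, StageName, Amount) are just illustrative stand-ins, not the ones from my actual page.

```apex
// Approach 1: let the database aggregate via aggregate SOQL.
AggregateResult[] results = [
    SELECT StageName stage, SUM(Amount) total, MAX(Amount) biggest
    FROM Opportunity
    GROUP BY StageName
];
for (AggregateResult ar : results) {
    System.debug(ar.get('stage') + ' -> ' + ar.get('total'));
}

// Approach 2: one plain SOQL query, then aggregate in Apex collections.
Map<String, Decimal> totals = new Map<String, Decimal>();
for (Opportunity o : [SELECT StageName, Amount FROM Opportunity]) {
    Decimal running = totals.containsKey(o.StageName)
        ? totals.get(o.StageName) : 0;
    totals.put(o.StageName, running + (o.Amount == null ? 0 : o.Amount));
}
```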

I ran a few tests to understand the behaviour, using application performance profiling.

Note: I used this link to get more insight. But it's always important to capture the right amount of log to troubleshoot the issue: if the log size is bigger than 1 MB, some parts may be truncated, which results in incorrect graphs in the Developer Console. Adjust log levels accordingly to get a complete and appropriate log. Also, it's important to leave out the first 2-3 logs before you analyze, as caching plays an important role in reducing turnaround times.

After thorough analysis, the VF guide proved right. Between writing multiple aggregate SOQL queries and using Apex collections for aggregation, it's better to choose the former: the first approach served the request in about half the time of the latter. Once cached, SOQL queries are very quick and Apex processing lags behind (also, don't forget that we have the CPU time governor limit to worry about). It's also quicker to get the aggregate information from the DB into Apex than to bring the whole record set over for processing.

Below are a few graphs.

CASE I - Multiple Aggregate SOQL (7 ms)
CASE I - One SOQL - Aggregate using Apex Collections (15 ms)

CASE II - Multiple Aggregate SOQL (3 ms)
CASE II - One SOQL - Aggregate using Apex Collections (6+ ms)
