
Archive for the ‘web-development’ Category

Use https securely

Thursday, June 7th, 2012


Nowadays everyone understands how easy it is to hijack an unsecured http session, so https is the key to a secure web. People sometimes skip https on small projects because of certificate prices; http://www.startssl.com/ solves this problem by issuing Class 1 certificates for free, and verified Class 2 certificates at a very low cost. So the only remaining problem is mistakes in how https is used.

keep cookie safe

The cookie can be stolen before the redirect to https. To avoid cookie theft, use the ‘Secure’ flag: it instructs the browser to send the cookie only over an https connection.

Set-Cookie: mycookie=somevalue; Path=/securesite/; Expires=Wed, 12 Dec 2345 00:00:00 GMT; Secure
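As a sketch, Python's standard http.cookies module can emit such a header with the Secure flag set (the cookie name and path mirror the example above; they are just placeholders):

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["mycookie"] = "somevalue"
cookie["mycookie"]["path"] = "/securesite/"
cookie["mycookie"]["secure"] = True  # send only over https

# OutputString() renders the value part of a Set-Cookie header
print(cookie["mycookie"].OutputString())
```

The printed value includes the bare `Secure` attribute, which is all the browser needs to withhold the cookie from plain http requests.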

make correct redirect

Once the certificate is configured on the application server, you need to redirect users from http://mysite.com to https://mysite.com. The redirect itself opens a vulnerability, since an attack can be performed before the redirect happens.

The HTTP/1.1 specification (RFC 2616) tells us that HTTP response code 301 (“Moved Permanently”) is cacheable by the browser by default, and 302 (“Found”/“Moved Temporarily”) is cacheable when Cache-Control or Expires allows it. So by using Expires or Cache-Control max-age with a far-future expiration date, we can make the browser skip the insecure redirect on repeat visits.

Expires: Mon, 01 Jan 2099 00:00:00 GMT
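As an illustration, these are the kinds of headers a server could attach to the http→https redirect so the browser caches it (the host name and values are examples, not from any specific setup):

```python
ONE_YEAR = 31536000  # seconds

# Headers accompanying a "301 Moved Permanently" response
redirect_headers = {
    "Location": "https://mysite.com/",
    "Cache-Control": "max-age=%d" % ONE_YEAR,
    "Expires": "Mon, 01 Jan 2099 00:00:00 GMT",
}

for name, value in sorted(redirect_headers.items()):
    print("%s: %s" % (name, value))
```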

Another idea is to use the Strict-Transport-Security header. It informs the browser that the website is accessible only through https; all http requests will be rewritten to https on the client side by the browser.

Strict-Transport-Security: max-age=31556926;

It tells browsers that support Strict-Transport-Security to use only https for this site for one year. At this time Firefox and Chrome support it; Opera is waiting for the draft to become an established standard.
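To see where that number comes from, here is a small sketch that parses the max-age directive out of a Strict-Transport-Security value (the helper name and inputs are made up for illustration):

```python
def hsts_max_age(header_value):
    """Return max-age in seconds from a Strict-Transport-Security
    header value, or None if the directive is missing."""
    for directive in header_value.split(";"):
        directive = directive.strip()
        if directive.lower().startswith("max-age="):
            return int(directive.split("=", 1)[1])
    return None

print(hsts_max_age("max-age=31556926"))  # 31556926 seconds ~ 1 year
```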

don’t mix content

You should ensure that you don’t load content from http sites on your https pages. People often forget about libraries loaded from a CDN or the Google Analytics snippet. So check every http entry on your site and change it to https.

nginx setup for t-wiki

Saturday, March 17th, 2012


I will start fast: you know that nginx is cool because it is faster than Apache, and you know that t-wiki is good because it is an open-source enterprise wiki used by a number of Fortune 500 companies. But you should also know that t-wiki is a Perl application.

“Perl – The only language that looks the same before and after RSA encryption.”
Keith Bostic

nginx setup

T-wiki developers don’t seem to believe in the power of nginx; that’s the only explanation I can think of for why they have a number of Apache examples, and even a web-based Apache configuration tool, but nothing for nginx. A quick Google search shows the common question is: “t-wiki Perl scripts don’t have extensions. How do I execute them with nginx?”

So, the only way is to read the documentation.

list of docs to look into:
http://wiki.nginx.org/Pitfalls (try it for sure)

nginx config file:

server {
    listen  my_ip:80;
    server_name my_servername.com www.my_servername.com;

    access_log /var/log/nginx/my_servername.com.access.log;

    location /wiki/pub {
        root /var/www;
    }

    location = /wiki/bin/configure {
        root /var/www;
        allow my_ip;
        deny  all;
        fastcgi_pass unix:/var/run/fcgiwrap.socket;
        include /etc/nginx/fastcgi_params;
    }

    location ~ /wiki/bin/(?<action>[a-z]+)(\/(?<path>.*))?$ {
        root /var/www;

        fastcgi_pass unix:/var/run/fcgiwrap.socket;
        fastcgi_param SCRIPT_FILENAME $document_root/wiki/bin/$action;
        fastcgi_param SCRIPT_NAME $action;

        # identifies the resource to be returned by the CGI script,
        # derived from the portion of the URI path hierarchy following
        # the part that identifies the script itself
        fastcgi_param PATH_INFO /$path;

        # virtual-to-physical translation appropriate to map it onto the
        # server's document repository structure
        fastcgi_param PATH_TRANSLATED $document_root/wiki/bin/$action;

        include /etc/nginx/fastcgi_params;
    }

    location ~ ^/wiki/(lib|data|locale|templates|tools|work) {
        deny all;
    }

    location = /favicon.ico {
        access_log off;
        log_not_found off;
    }
}
How to end the differences in HTML rendering between browsers

Thursday, August 18th, 2011


When there is no standard, there is no common approach to the same things, and we get chaos.
But we do have standards for HTML and CSS; we can find them all on the W3C pages http://www.w3.org/MarkUp/ and http://www.w3.org/Style/CSS/.
And still we hear “Why does my website look different in different browsers?” from users, and “I want to kill the developers of Internet Explorer” from web developers.

complexity of standards

The first problem is the complexity of the standards, which have to take many different things into account. It is hard for developers to understand them and build products accordingly.
Since the W3C can’t simplify the standards, it should put special effort into developing and providing a dedicated set of test cases, like the famous Acid tests, but one that provides not just a set of hand-picked features but complete coverage of the specifications: XHTML, CSS, DOM, SVG. That would give us a standard way to test browsers, and someday we would finally get the same picture in all of them.

human nature

The problem is not only in the standards but in human nature, which incites some people to use evolving versions of HTML/CSS to get fantastic features. But I believe that once the W3C begins to provide such tests, it will be evident to everyone whether the developer is to blame for bugs or for relying on experimental features.

Don’t use Google reCAPTCHA

Friday, June 10th, 2011


Google reCAPTCHA is a great idea originally developed at Carnegie Mellon University by Guatemalan computer scientist Luis von Ahn. It uses captchas to help digitize the text of books while protecting websites from bots. According to Google’s reports it displays over 100 million captchas every day. Among its subscribers are such popular sites as Facebook, Twitter, CNN.com, and StumbleUpon.

main drawback

So, the main drawback is the complexity of the captchas. They are getting more and more complex, or even unrealistic to deal with. Just search Twitter for “recaptcha” and you’ll see an amazing number of people wondering what is going on.

number to think about

The number is 14%. According to my research on two of our sites, http://prices.by and http://cartenergy.ru, we were losing about 14% of users at service sign-up while using reCAPTCHA.
The test was conducted as an A/B test in which I alternated between our own captcha and Google reCAPTCHA on the sign-up page.

hint to google

Provide a parameter to select the difficulty level; supporting the user’s native language is also a way to simplify solving while keeping the security strong.

WebP – 39% more compression than JPEG

Wednesday, June 1st, 2011

WebP is a lossy compression method proposed by Google. The degree of compression is adjustable so a user can choose between file size and image quality. WebP typically achieves an average of 39% more compression than JPEG without loss of image quality.

You can check the gallery that compares JPEG and WebP (the WebP images are more than 30% smaller than the JPEG ones): http://code.google.com/speed/webp/gallery.html. The only problem with this format is poor browser support: at this time it is just Google Chrome 9+ and Opera 11.10 beta.

You can create WebP images in ImageMagick and XnConvert. You can also use the WebP command-line utility, cwebp, to convert images.
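As a hedged sketch of scripting the conversion, this helper builds a cwebp invocation and runs it only if the encoder is installed and the source file exists (the file names and quality value are placeholders):

```python
import os
import shutil
import subprocess

def to_webp(src, dest, quality=80):
    """Build the cwebp command line; run it only when the `cwebp`
    encoder (from the libwebp package) is available and src exists."""
    cmd = ["cwebp", "-q", str(quality), src, "-o", dest]
    if shutil.which("cwebp") and os.path.exists(src):
        subprocess.run(cmd, check=True)
    return cmd

print(" ".join(to_webp("input.jpg", "output.webp")))
```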

Find more information about WebP: http://code.google.com/speed/webp/

©Helion-Prime Solutions Ltd.
Custom Software Development Agile Company.