

A weblog by Helion-Prime Solutions about software design, experience, business, the web, simplicity and more

GitHub good working practice

June 10th, 2013 by vasiliy.kiryanov

GitHub is a wonderful place to learn new things and to help the open source community by participating in various projects. GitHub newcomers often run into similar problems with their git workflow, and I want to describe a simple strategy to avoid them.

Work with a fork

The first thing is to fork the repository into your GitHub account. Go through the https://help.github.com/articles/fork-a-repo instructions on how to “Configure Remotes” and “Pull in upstream changes” to keep your fork in sync with changes that happen in the official repository.
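
As a minimal sketch of that setup (the user and project names below are placeholders, not a real repository), the remotes usually end up looking like this:

$ git clone https://github.com/<your_username>/<project>.git
# your fork is cloned and becomes the 'origin' remote
$ cd <project>
$ git remote add upstream https://github.com/<original_owner>/<project>.git
# the official repository is added as the 'upstream' remote
$ git remote -v
# verify that both 'origin' and 'upstream' are listed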

Never commit on master branch

Each time you want to commit a bug fix or a feature, create a branch for it. The maintainer of a project has no problem accepting a pull request from your master branch, but it becomes a problem for your fork: when you later want to pull changes back from upstream, your master branch will have diverged from it.

With this strategy you can think of your master branch as a “landing place” for upstream changes; even your own commits will end up on master only after they have been merged upstream. Also, while a pull request from one branch is pending, you can continue working on another branch and open another pull request.

Before creating a new branch, pull the changes from upstream; your master needs to be up to date.

$ git fetch upstream
# Pulls in changes not present in your local repository, without modifying your working files
$ git merge upstream/master
# Merges any changes fetched into your working files

Create new branch on your local machine:

$ git branch <name_of_your_branch>

Switch to your new branch:

$ git checkout <name_of_your_branch>
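
Creating and switching can also be done in one step:

$ git checkout -b <name_of_your_branch>
# creates the branch and switches to it immediately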

Check current branch you are working on:

$ git branch

Push the branch with your commits to GitHub:

$ git push origin <name_of_your_branch>

Delete a branch on your local filesystem:

$ git branch -d <name_of_your_branch>

Delete the branch on GitHub (you can also do this on the GitHub site):

$ git push origin :<name_of_your_branch>
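
Putting it all together, here is a quick sketch of the whole cycle for a hypothetical branch called fix-typo:

$ git fetch upstream
$ git checkout master
$ git merge upstream/master
# master is now in sync with upstream
$ git checkout -b fix-typo
# ...edit files, git add, git commit...
$ git push origin fix-typo
# then open a pull request from fix-typo on the GitHub site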

Useful hacks

If you accidentally commit on master but have not pushed your changes:

$ git reset --hard upstream/master
# resets your master branch to the same state as upstream/master.
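
If you want to keep those accidental commits instead of throwing them away, one option (using a hypothetical branch name rescue-work) is to branch off before resetting:

$ git branch rescue-work
# keeps a pointer to the accidental commits
$ git reset --hard upstream/master
# master goes back to the upstream state
$ git checkout rescue-work
# the commits are still here, now on their own branch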

If you need to merge several commits into one commit:
Say your bug-fix branch is called bugfix, then on the master branch issue the following commands:

$ git merge --squash bugfix
$ git commit
# takes all the commits from the bugfix branch, squashes them into one commit and merges it into your master branch.
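
An alternative, if you prefer to clean up the history on the bugfix branch itself, is an interactive rebase against master:

$ git checkout bugfix
$ git rebase -i master
# in the editor, leave 'pick' on the first commit and change the rest to 'squash'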

If you need to amend the last commit:
Make the fixes. (If you just want to change the log message, skip this step.)
Commit the changes in “amend” mode:

$ git commit --all --amend
# your editor will come up asking for a log message (by default, the old log message).
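
Keep in mind that --amend rewrites the commit, so if it has already been pushed to your branch on GitHub you will need to force-push it (fine on your own feature branch, never on a shared one):

$ git push --force origin <name_of_your_branch>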

Use https securely

June 7th, 2012 by vasiliy.kiryanov


Nowadays everyone understands how easy it is to hijack an unsecured http session, so https is the key to a secure web. Sometimes people don’t use https for small projects because of the price of certificates; http://www.startssl.com/ solves this problem: they give Class 1 certificates away for free, and verified Class 2 certificates cost ridiculously little. So the only remaining problem is mistakes in how https is used.

keep cookies safe

A cookie can be stolen before the redirect to https; to avoid cookie theft you need to use the ‘Secure’ flag. It instructs the browser to send the cookie only over an https connection.

Set-Cookie: mycookie=somevalue; Path=/securesite/; Expires=Wed, 12 Dec 2345 00:00:00 GMT; Secure

make a correct redirect

Once the certificate is set up on the application server, you need to redirect users from http://mysite.com to https://mysite.com. The redirect itself opens a vulnerability, because an attack can be performed before the redirect happens.

The HTTP/1.1 specification (RFC 2616) tells us that http response codes 301 (“moved permanently”) and 302 (“found”/”moved temporarily”) can be cached by the browser. So by using Expires or Cache-Control max-age with a far-future expiration date we can make the browser remember the redirect and avoid repeating it over plain http.

Expires: Mon, 01 Jan 2099 00:00:00 GMT
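
As a sketch of how this can look on the server side (nginx here, with mysite.com as a placeholder), a cacheable permanent redirect from http to https:

server {
    listen 80;
    server_name mysite.com www.mysite.com;

    # 301 responses may be cached; far-future Expires/Cache-Control headers
    # let the browser remember the redirect and skip the insecure hop next time
    expires max;

    return 301 https://mysite.com$request_uri;
}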

Another idea is to use the Strict-Transport-Security header. It informs the browser that the website is accessible only over https; all http requests will be rewritten to https on the client side by the browser.

Strict-Transport-Security: max-age=31556926;

It tells a browser that supports Strict-Transport-Security to use only https for this particular site for one year. At the moment Firefox and Chrome support it; Opera is waiting for the standard to change its status to ‘agreed’ or ‘established’.
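
On the server side it is just one more response header; an nginx sketch for the https server block:

server {
    listen 443 ssl;
    server_name mysite.com;

    # sent only over https; tells supporting browsers to use https for one year
    add_header Strict-Transport-Security "max-age=31556926";

    # ... ssl_certificate, ssl_certificate_key, locations ...
}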

don’t mix content

You should ensure that you don’t load content over plain http. People often forget that they use a CDN to load libraries, or Google Analytics. So check every http entry on your site and change it to https.
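
For example, a library loaded from a CDN (the URL is only an illustration):

<!-- mixed content: loaded over plain http even on an https page -->
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>

<!-- fixed: explicit https (a protocol-relative "//..." URL also works) -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>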


nginx setup for t-wiki

March 17th, 2012 by vasiliy.kiryanov


I will keep this short: you know that nginx is cool because it is faster than apache, and you know that t-wiki is good because it is an open-source enterprise wiki used by a number of Fortune 500 companies. But you should also know that t-wiki is a Perl application.

“Perl – The only language that looks the same before and after RSA encryption.”
Keith Bostic

nginx setup

T-wiki developers don’t seem to believe in the power of nginx; that’s the only explanation that comes to my mind when I wonder why they have a number of apache examples and even a web-based apache configuration tool, but nothing for nginx. A quick Google search shows that a common question is: “t-wiki perl scripts don’t have extensions. How do I execute them with nginx?”

So, the only way is to read the documentation.

A list of docs to look into:
http://wiki.nginx.org/Pitfalls (read it for sure)

nginx config file:

server {
    listen      my_ip:80;
    server_name my_servername.com www.my_servername.com;

    access_log /var/log/nginx/my_servername.com.access.log;

    # static attachments under /wiki/pub are served by nginx directly
    location /wiki/pub {
      root /var/www;
    }

    # the configure script is limited to the admin IP and run through fcgiwrap
    location = /wiki/bin/configure {
      root /var/www;
      allow my_ip;
      deny  all;
      fastcgi_pass unix:/var/run/fcgiwrap.socket;
      include /etc/nginx/fastcgi_params;
    }

    # the extensionless perl scripts in /wiki/bin are executed through fcgiwrap
    location ~ /wiki/bin/(?<action>[a-z]+)(\/(?<path>.*))?$ {
      root /var/www;

      fastcgi_pass unix:/var/run/fcgiwrap.socket;
      fastcgi_param SCRIPT_FILENAME $document_root/wiki/bin/$action;
      fastcgi_param SCRIPT_NAME $action;

      # identifies the resource to be returned by the CGI script,
      # and is derived from the portion of the URI path hierarchy following
      # the part that identifies the script itself.
      fastcgi_param PATH_INFO /$path;

      # virtual-to-physical translation appropriate to map it onto the
      # server's document repository structure
      fastcgi_param PATH_TRANSLATED $document_root/wiki/bin/$action;

      include /etc/nginx/fastcgi_params;
    }

    # internal t-wiki directories must never be served directly
    location ~ ^/wiki/(lib|data|locale|templates|tools|work) {
      deny all;
    }

    location = /favicon.ico {
      access_log off;
      log_not_found off;
    }
}
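
The config above assumes fcgiwrap is installed and listening on /var/run/fcgiwrap.socket. On a Debian/Ubuntu-like system that is roughly (a sketch, package and service names may differ on your distribution):

$ sudo apt-get install fcgiwrap
$ sudo service fcgiwrap start
$ ls -l /var/run/fcgiwrap.socket
# the socket path must match the fastcgi_pass directives above
$ sudo nginx -t && sudo service nginx reload
# test the configuration and reload nginx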

How to end the differences in HTML rendering between browsers

August 18th, 2011 by vasiliy.kiryanov


When there is no standard, there is no common approach to the same things, and we have chaos.
But we do have standards for HTML and CSS; we can find them all on the W3C pages http://www.w3.org/MarkUp/ and http://www.w3.org/Style/CSS/.
And still we hear “Why does my website look different in different browsers?” from users, and “I want to kill the developers of Internet Explorer” from web developers.

complexity of standards

The first problem is the complexity of the standards, which have to take many different things into account. It is hard for developers to understand them and build products accordingly.
Since the W3C can’t simplify the standards, it should put special effort into developing and providing a dedicated set of test cases, like the famous Acid tests, but covering the specifications completely (XHTML, CSS, DOM, SVG) rather than a set of randomly picked features. That would give us a standard way to test browsers, and someday we would finally see the same picture in all of them.

human nature

The problem is not only in the standards but also in human nature, which incites some people to use evolving versions of HTML/CSS to get fantastic features. But I believe that once the W3C starts providing such tests, it will be evident to everyone whether a bug is the developer’s fault or a consequence of using experimental features.


Improvement of Google Ads

June 24th, 2011 by vasiliy.kiryanov

We all know that Google generates its profit primarily from its advertising programs. Keeping them effective is a very important but difficult task, as users tend to ignore advertisements and use special browser plug-ins to block them.

The key here is to make ads more attractive by using information about users, or, as people say today, to make them more social. It is clear that for social networks like Facebook it is much easier to collect such data than for a search engine, which can only remember the history of your searches and detect your current location.

the straightforward solution

Ask users to provide the data and set up the ads they want to see!
[Image: Google search page with an Ads customization option]

After users click “Select Ads you want to see” they get a simple Ads dashboard:
[Image: mockup of a simple Google Ads dashboard]
When users can select useful content, there is no reason for them to block or ignore it. Google can reduce the number of ad placements (like the recent bottom placement in Gmail) and decrease distraction even more. Additionally, Google could add Google Offers here and turn ads into something fun.
