Remote access to IPython notebooks

February 18, 2015

Original post: https://coderwall.com/p/ohk6cg/remote-access-to-ipython-notebooks-via-ssh

On the remote machine, start a notebook server without a browser:

remote$ ipython notebook --no-browser --port=8889

On the local machine, forward local port 8888 to port 8889 on the remote machine, then open http://localhost:8888 in your browser:

local$ ssh -N -f -L localhost:8888:localhost:8889 remote_user@remote_host

To close the SSH tunnel on the local machine, look for the process and kill it manually:

local_user@local_host$ ps aux | grep localhost:8889
local_user 18418  0.0  0.0  41488   684 ?        Ss   17:27   0:00 ssh -N -f -L localhost:8888:localhost:8889 remote_user@remote_host
local_user 18424  0.0  0.0  11572   932 pts/6    S+   17:27   0:00 grep localhost:8889

local_user@local_host$ kill -15 18418
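
If you don't want to hunt for the PID by hand, a one-liner sketch using pkill (whose -f flag matches against the full command line) should also work:

local$ pkill -f "ssh -N -f -L localhost:8888:localhost:8889"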

Alternatively, you can start the tunnel without the -f option. The process will then remain in the foreground and can be killed with ctrl-c.

On the remote machine, kill the IPython server with ctrl-c ctrl-c.

Note: If you are running Theano on a GPU on your remote machine, you can launch the notebook with:

THEANO_FLAGS=mode=FAST_RUN,device=gpu,floatX=float32 ipython notebook --no-browser --port=8889

Another simple way is to do the following (adding --ip=*):

# On the remote server

$ THEANO_FLAGS=mode=FAST_RUN,device=gpu,floatX=float32 ipython notebook --no-browser --ip=* --port=7777

Then you can reach the notebook at http://the-ip-address-of-your-remote-server:7777/. Note that binding to all interfaces exposes the notebook to anyone who can reach the server, so only do this on a trusted network.

Categories: Uncategorized

A few things when using Eclipse

January 13, 2015

Workspace is locked.

If you encounter a situation where Eclipse says:

“Could not launch the product because the associated workspace is currently in use by another Eclipse application.” or “Workspace in use or cannot be created, choose a different one.”

[Screenshot: the Eclipse workspace-in-use error dialog]

Just delete the .lock file in the .metadata directory in your Eclipse workspace directory.
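
For example, assuming your workspace lives at ~/workspace (adjust the path to wherever yours actually is), and with Eclipse shut down first:

$ rm ~/workspace/.metadata/.lock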

Install Eclipse IDE and Java/C++ development tools on Ubuntu 12.04 LTS Precise Pangolin using the command line

Original link: http://www.inforbiro.com/blog-eng/ubuntu-12-04-eclipse-installation/
1) Open a terminal and enter the command:
sudo apt-get install eclipse-platform
2) After Eclipse is installed, you can install development plugins based on your needs, e.g.:
sudo apt-get install eclipse-jdt    # Java Development Tools (JDT)
sudo apt-get install eclipse-cdt    # C/C++ development tools (CDT)

Replace tabs with spaces in Eclipse CDT:

Original from here.
For CDT: go to Window > Preferences > C/C++ > Code Style > Formatter > New (create a new profile, because the built-in profiles cannot be changed) > MyProfile (choose a name for the profile) > Indentation > Tab policy > Spaces only

Categories: Tools

Random notes

July 1, 2014

If you type relentless.com into a browser, you will be rerouted to Amazon. Amazon is introducing its new Fire phone, which includes an OCR technology called Firefly that recognizes movies, songs, etc. Certainly interesting, but I look forward to seeing how well and how fast it performs when it finally comes out.

Categories: MISC

Installing Mercurial on Mac

May 29, 2014
$ brew install mercurial

If you see errors like:

clang: error: unknown argument: '-mno-fused-madd' [-Wunused-command-line-argument-hard-error-in-future]
clang: note: this will be a hard error (cannot be downgraded to a warning) in the future

you can disable the ‘warning’ (which now shows up as an error) by:

$ ARCHFLAGS=-Wno-error=unused-command-line-argument-hard-error-in-future \ 
brew install mercurial

Again, after the install succeeds, if you see a linking error:

Error: Could not symlink file: /usr/local/Cellar/mercurial/2.9/share/man/man5/hgrc.5
/usr/local/share/man/man5 is not writable. You should change its permissions.

You can change the permissions. It is said to be safe to change ownership of the whole /usr/local; if you don’t want to do that, just fix this one directory:

$ sudo chown -R 'your-user-name' /usr/local/share/man/man5
$ brew link mercurial
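
To confirm the install worked, print the Mercurial version:

$ hg version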
Categories: Python, Software

Deep learning on visual recognition task

May 13, 2014

Current benchmarks on visual recognition tasks:

http://www.csc.kth.se/cvap/cvg/DL/ots/

Categories: Uncategorized

Install Deepnet on Mac

November 15, 2013

This may help you get Nitish’s deepnet working on your Mac. The code is very clean; the most important thing is to follow the instructions here: https://github.com/nitishsrivastava/deepnet/blob/master/INSTALL.txt

(1) DEPENDENCIES

a) You will need NumPy and SciPy installed first, because the tool is largely Python. A simple way is to use ‘brew’. For example, follow the instructions here.
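
One possible route (an assumption, not the only way; this uses Homebrew’s Python and the pip that comes with it):

$ brew install python
$ pip install numpy scipy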

b) CUDA Toolkit and SDK.
Follow the instructions(CUDA5.5):  http://docs.nvidia.com/cuda/cuda-getting-started-guide-for-mac-os-x/
NVIDIA CUDA Toolkit (available at http://developer.nvidia.com/cuda-downloads)

I followed both the instructions at http://docs.nvidia.com/cuda/cuda-getting-started-guide-for-mac-os-x/
and the instructions from deepnet to set the system paths:

export PATH=/Developer/NVIDIA/CUDA-5.5/bin:$PATH
export DYLD_LIBRARY_PATH=/Developer/NVIDIA/CUDA-5.5/lib:$DYLD_LIBRARY_PATH

Following the deepnet instructions: for Mac, the file is ‘~/.profile’; edit/add to the file:

export CUDA_BIN=/usr/local/cuda-5.0/bin
export CUDA_LIB=/usr/local/cuda-5.0/lib
export PATH=${CUDA_BIN}:$PATH
export LD_LIBRARY_PATH=${CUDA_LIB}:$LD_LIBRARY_PATH
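
(These exports come from the deepnet instructions and reference CUDA 5.0; adjust the version to match the one you actually installed, e.g. CUDA-5.5.) After reloading the profile, you can check that the CUDA compiler is on your PATH:

$ source ~/.profile
$ nvcc --version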

First make sure CUDA is installed right:
install the examples: cuda-install-samples-5.5.sh <dir>

Then go to /Developer/NVIDIA/CUDA-5.5/samples, choose any simple example subfolder, go into it and run ‘make’; after make completes, you can do a simple test.
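
For example, the deviceQuery sample is a quick sanity check (assuming the default samples layout for CUDA 5.5):

$ cd /Developer/NVIDIA/CUDA-5.5/samples/1_Utilities/deviceQuery
$ make
$ ./deviceQuery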

c) Protocol Buffers.

Download the file: http://code.google.com/p/protobuf/

Follow the instructions to compile/install it. It will be installed (generally as /usr/local/bin/protoc). It was said that you only need to include the directory that contains ‘protoc’, so add it to your path:
export PATH=$PATH:/usr/local/bin
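
You can verify the install by printing the compiler version:

$ protoc --version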

(2) COMPILING CUDAMAT AND CUDAMAT_CONV

To make the CUDA part work, change all occurrences of ‘uint’ to ‘unsigned’ in the file cudamat_conv_kernels.cuh (or add a ‘#define uint unsigned’ at the top of that file), then run ‘make’ in the cudamat folder.
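
A one-liner sketch for the replacement, using BSD sed’s word-boundary syntax so it only touches whole-word ‘uint’ (still worth reviewing the diff afterwards):

$ sed -i '' 's/[[:<:]]uint[[:>:]]/unsigned/g' cudamat_conv_kernels.cuh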

(3,4) STEPS 3 AND 4

Continue following steps 3 and 4 in https://github.com/nitishsrivastava/deepnet/blob/master/INSTALL.txt and you will get there.

Note (1): I did not separately install the cudamat library by Vlad Mnih or the cuda-convnet library by Alex Krizhevsky.

Note (2): If you do NOT have a GPU: most recent Macs come with an NVIDIA GT 650M, but some older models use an Intel graphics card. In that case you can still do the deep learning part, but using eigenmat instead of the GPU. The drawback is that it will be very slow.

Install Eigen from here: http://eigen.tuxfamily.org/index.php?title=Main_Page
If you get an error that <Eigen/...> cannot be found, change the include to "Eigen/...".
You also need to change the Python path, adding the directory where ‘libeigenmat.dylib’ is located. If it still fails to find libeigenmat.dylib, it may not hurt to give it a direct path; edit the file eigenmat/eigenmat.py:
_eigenmat = ct.cdll.LoadLibrary('the-path-to/libeigenmat.dylib')
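
Equivalently, a sketch of the environment-variable route (the /path-to/deepnet location is an assumption; adjust to where your checkout actually lives):

$ export PYTHONPATH=$PYTHONPATH:/path-to/deepnet/eigenmat
$ export DYLD_LIBRARY_PATH=$DYLD_LIBRARY_PATH:/path-to/deepnet/eigenmat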

Rectifier Nonlinearities

November 6, 2013

There are multiple choices of activation function for a neural network. Much work has shown that using the rectified linear unit (ReLU) helps improve discriminative performance.

The figure below shows a few popular activation functions, including sigmoid and tanh.

[Figure: plots of several popular activation functions]

sigmoid:       g(x) = 1 / (1 + exp(-x)). The derivative of the sigmoid function is g'(x) = (1 - g(x)) g(x).

tanh:              g(x) = sinh(x)/cosh(x) = ( exp(x) - exp(-x) ) / ( exp(x) + exp(-x) )

Rectifier (hard ReLU) is really a max function

g(x)=max(0,x)

Another version is the noisy ReLU, max(0, x + N(0, σ(x))). ReLU can be approximated by a so-called softplus function (for which the derivative is the logistic function):

g(x) = log(1 + exp(x))

The derivative of the hard ReLU is constant over the two ranges x < 0 and x > 0: for x > 0, g' = 1, and for x < 0, g' = 0.
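
As a quick check that the softplus derivative really is the logistic (sigmoid) function, in LaTeX notation:

g'(x) = \frac{d}{dx} \log(1 + e^{x}) = \frac{e^{x}}{1 + e^{x}} = \frac{1}{1 + e^{-x}} = \mathrm{sigmoid}(x)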

This recent ICML paper discusses possible reasons why ReLU sometimes outperforms the sigmoid function:

  • Hard ReLU naturally enforces sparsity.
  • The derivative of ReLU is constant, whereas the derivative of the sigmoid dies out as we either increase or decrease x.
Categories: Machine Learning