SAMKHYA (सांख्य)

HIGH PERFORMANCE COMPUTING FACILITY


Overview

Samkhya (सांख्य), also transliterated as Sankhya, Sāṃkhya, or Sāṅkhya, is a Sanskrit word that, depending on the context, means "to reckon, count, enumerate, calculate, deliberate, reason, reasoning by numeric enumeration, relating to number, rational".[1]

SAMKHYA, the High Performance Computing (HPC) facility at IOP, is a 222-teraFLOPS hybrid environment consisting of CPUs, NVIDIA Tesla GPUs, Intel Xeon Phi coprocessors, a QDR InfiniBand interconnect, and 7.5 TB of memory.

Samkhya has been featured in the Top Supercomputers-India list since January 2018.
Link: January 2018 report



System overview:

  • 1440 Intel Haswell CPU cores
  • 7.5 TB memory
  • 40 NVIDIA Tesla K80 cards
  • 40 Intel Xeon Phi 7120P cards
  • 50 TB of object storage
  • QDR InfiniBand interconnect

References:
[1] Wikipedia

Accessing the Facility

Users should use an SSH client such as PuTTY to log in to our machines. Unencrypted methods such as telnet, rlogin, and XDM are NOT allowed. Any SSH client will typically require the name or IP address of the target machine, a username, and a password before granting access. The interface differs from platform to platform (PC-based clients typically have GUIs, while Unix-based clients may not).

You should also get into the habit of using secure copy (scp, a companion program usually bundled with SSH) instead of the traditional ftp utility to transfer files. scp is more flexible than ftp in that it can transfer whole directories as well as individual files. Graphical scp clients are available for Windows and for Macs.

If you wish to run programs with graphical interfaces on one of our machines and have them display on your workstation, you will need an X11 server or X11 server emulator running on your workstation.


Instructions for using SSH with our systems are listed below for each operating system:

SAMKHYA USERS


Windows Users:
  1. Install an SSH client such as PuTTY.
  2. Open the SSH client.
  3. Click the Quick Connect button.
  4. Enter the hostname: samkhyassh.iopb.res.in
  5. Enter your username (your user account) and click Connect; enter your password when prompted.

Linux Users:
  1. Open a terminal and type ssh [your account name]@samkhyassh.iopb.res.in.
  2. Enter your password when prompted.
  3. After a successful login, you will be in your home directory.
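The login steps above can be shortened with an SSH client configuration entry. A minimal sketch, assuming OpenSSH: the "samkhya" alias and the placeholder username are illustrative, while the hostname is the one given above.

```
# Hypothetical entry in ~/.ssh/config; the "samkhya" alias is illustrative.
Host samkhya
    HostName samkhyassh.iopb.res.in
    # Replace with your Samkhya account name:
    User your_account_name
    # Forward X11 so graphical programs display on your workstation:
    ForwardX11 yes
```

With this entry in place, "ssh samkhya" logs you in and "scp samkhya:output.dat ." copies a file, without retyping the hostname each time.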

Getting an Account

Please fill in the form available here to request an account.

Nodes Information

Head (Master) Node

The head node is used for login and job submission; it should not be used for running jobs. Our HPC facility has two master nodes for high availability.

To get the IP address/hostname of the Samkhya HPC, please contact HPC Support.


Compute (Slave) Node

The HPC facility has 60 compute nodes in a hybrid environment containing CPUs, NVIDIA GPUs, and Intel Xeon Phi coprocessors.

Sl. No.  Nodes          Type                    Cores / Cards
1        node1-node20   CPU only                480 cores (24 cores/node)
2        node21-node40  NVIDIA GPU and CPU      480 cores (24 cores/node), 40 GPU cards (2 per node)
3        node41-node60  Intel Xeon Phi and CPU  480 cores (24 cores/node), 40 Xeon Phi cards (2 per node)
Total: 1440 CPU cores, 40 NVIDIA cards, 40 Xeon Phi cards

Application

Application and version   Directory / module
Cython 0.26               /data/software/cython/bin
FFTW 3.3.7                /data/software/fftw337/bin
GCC 4.8.5 (default)       /data/software/gcc485/bin
GCC 6.2                   module load gcc62 (/data/software/gcc62/bin)
GCC 7.2                   module load gcc72 (/data/software/gcc72/bin)
GLoBES 3.2.16             module load globes3216 (/data/software/globes3216/bin)
Grace 5.1.25              /data/software/grace5125/grace/bin
GSL 1.15                  /data/software/gsl115/bin
HDF5 1.8.20               /data/software/hdf5/bin
LAPACK 3.7.1              /data/software/lapack-3.7.1/liblapack.a, /data/software/lapack-3.7.1/librefblas.a
LHAPDF 5.9.1              /data/software/lhapdf-5.9.1/bin
nuSQuIDS (master)         /data/software/nuSQuIDS-master/bin
SQuIDS (master)           /data/software/squids/lib
OCaml 4.05.0              /data/software/ocaml/bin
opam                      /data/software/opam/bin
PGI 18.4                  /data/software/pgi/linux86-64/18.4/bin
PVM 3.4.5                 /data/software/pvm/pvm3/bin
Pythia 8180               /data/software/pythia/bin
ROOT 5.34.36              /data/software/root/bin
ROOT 6.14                 /data/software/root-6.14.00/obj/bin
VASP 5.3                  /data/software/vasp
VMD 1.9.3                 /data/software/vmd-1.9.3/bin
Gnuplot 5.0.1             /data/software/gnuplot501/bin
Mathematica 11.2          module load mathematica_11.2
Mathematica 11.3          module load mathematica_11.3
MATLAB 8.1.0.604          /data/software/Matlab8/bin
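Entries marked "module load" above are selected through environment modules. As a sketch of a typical compile setup (the helper script compile.sh and the source file name myprog.c are illustrative, not part of the facility's documentation), the following writes a small build script to be run on Samkhya:

```shell
#!/bin/bash
# Sketch: select a non-default compiler via environment modules.
# Writes a helper script; run compile.sh on Samkhya itself, where
# the module command is available. File names are illustrative.
cat > compile.sh <<'EOF'
#!/bin/bash
# GCC 4.8.5 is the default; load the gcc72 module for GCC 7.2
module load gcc72
gcc -O2 -o myprog myprog.c
EOF
chmod +x compile.sh
```

The same pattern applies to the other module-based entries (e.g. module load globes3216 or module load mathematica_11.2).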

Network

  • 1 Gbps LAN connectivity
  • QDR 40 Gbps InfiniBand interconnect

Job Scheduler

Jobs are submitted to the compute nodes through a resource manager called Torque; the job scheduler is PBS.

Scheduler commands:

Command                   Use and output                    Remarks
qsub <submission_script>  Submit a job                      Prints the job ID
qstat                     Status of jobs                    Option "-f" for full display
pbsnodes -a               Show status of all compute nodes
pbsnodes -ln              Show down nodes
qstat -B                  Show queue status and jobs
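The commands above can be combined into a small status helper. This is only a sketch: writing the file works anywhere, but the resulting check_cluster.sh (an illustrative name) must be run on Samkhya, where the Torque/PBS commands exist.

```shell
#!/bin/bash
# Sketch: bundle the scheduler commands from the table above into
# one helper script. Run check_cluster.sh on Samkhya's head node.
cat > check_cluster.sh <<'EOF'
#!/bin/bash
echo "== Job status =="
qstat
echo "== Down nodes =="
pbsnodes -ln
echo "== Queue summary =="
qstat -B
EOF
chmod +x check_cluster.sh
```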

Storage

Each user has 500 GB of storage space; the total storage capacity is 50 TB.

There is no data backup mechanism at present; users are advised to back up their important files on storage outside the HPC facility.

Submitting Jobs to the Cluster using the scheduler

[A] For Jobs Using only CPUs

Sample serial job script

#!/bin/bash
#PBS -o $PBS_JOBID.out
#PBS -e $PBS_JOBID.err
#PBS -q small
#PBS -l nodes=1:ppn=1
cd $PBS_O_WORKDIR
icc input_file    # or: gcc input_file
./a.out

Save the file as script.cmd, keep your input file and this script in the same directory, and run "qsub script.cmd". Use a new directory every time you submit a new job; do not submit multiple jobs from the same directory.
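The one-directory-per-job rule can be scripted. A minimal sketch, assuming a C input file: the directory name, input_file.c, and the cp line are illustrative, and qsub is left commented because it must be run on the cluster's head node.

```shell
#!/bin/bash
# Sketch of the recommended workflow: one fresh directory per job.
# Names are illustrative; run the commented steps on Samkhya.
JOBDIR="run_$(date +%Y%m%d_%H%M%S)"
mkdir -p "$JOBDIR"
cd "$JOBDIR"

# Write the serial job script from section [A]:
cat > script.cmd <<'EOF'
#!/bin/bash
#PBS -o $PBS_JOBID.out
#PBS -e $PBS_JOBID.err
#PBS -q small
#PBS -l nodes=1:ppn=1
cd $PBS_O_WORKDIR
gcc input_file.c
./a.out
EOF

# cp ../input_file.c .    # keep the input file alongside the script
# qsub script.cmd         # submit from this directory on the head node
```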

[B] Sample job script (single-node parallel jobs)

#!/bin/bash
#PBS -o $PBS_JOBID.out
#PBS -e $PBS_JOBID.err
#PBS -q small
#PBS -l nodes=1:ppn=24
cd $PBS_O_WORKDIR
mpicc input_file
mpirun -np 24 -machinefile $PBS_NODEFILE ./a.out

Save and submit as described in [A]: put the script and your input file in a fresh directory and run "qsub script.cmd".

[C] Sample job script (multi-node parallel jobs)

#!/bin/bash
#PBS -o $PBS_JOBID.out
#PBS -e $PBS_JOBID.err
#PBS -q medium
#PBS -l nodes=4:ppn=24
cd $PBS_O_WORKDIR
mpicc input_file
mpirun -np 96 -machinefile $PBS_NODEFILE ./a.out

Save and submit as described in [A]: put the script and your input file in a fresh directory and run "qsub script.cmd".

[D] For Jobs Using only GPUs

Sample job script (GPU based jobs)

#!/bin/bash
#PBS -o $PBS_JOBID.out
#PBS -e $PBS_JOBID.err
#PBS -q gpu
#PBS -l nodes=1:ppn=2:gpu
cd $PBS_O_WORKDIR
nvcc input_file
./a.out

Save and submit as described in [A]: put the script and your input file in a fresh directory and run "qsub script.cmd".

[E] To submit jobs to the Phi nodes

#!/bin/bash
#PBS -o $PBS_JOBID.out
#PBS -e $PBS_JOBID.err
#PBS -q phi
#PBS -l nodes=1:ppn=2:phi
cd $PBS_O_WORKDIR
icc input_file
./a.out

Save and submit as described in [A]: put the script and your input file in a fresh directory and run "qsub script.cmd".

Changing a password

To change your password, log in to the cluster, type the command passwd, and respond to the prompts by entering your old password and then your new password.

Tutorials

ROOT:
  • Step 1: Log in with X11 forwarding: ssh -X account_name@10.0.0.27, then enter your password.
  • Step 2: [user@konark ~]$ module load gcc72 (gcc 7.2 is the compatible version)
  • Step 3: [user@konark ~]$ root -l

MATLAB:
  • Step 1: Log in with X11 forwarding: ssh -X account_name@10.0.0.27, then enter your password.
  • Step 2: [user@konark ~]$ matlab

Mathematica:
  • Step 1: Log in with X11 forwarding: ssh -X account_name@10.0.0.27, then enter your password.
  • Step 2: [user@konark ~]$ module load mathematica_11.2
  • Step 3: [user@konark ~]$ mathematica

Publications

1. "Cross-linker mediated compaction and local morphologies in a model chromosome", Amit Kumar and Debasish Chaudhuri, Journal of Physics: Condensed Matter (2019)

2. "Re-entrant phase separation in nematically aligning active polar particles", Biplab Bhattacherjee and Debasish Chaudhuri, Soft Matter (2019)

3. "Dipole–dipole interaction induced phases in hydrogen-bonded squaric acid crystal", Vikas Vijigiri and Saptarshi Mandal

4. "Zeros of partition function for continuous phase transitions using cumulants". D. Majumdar and S. M. Bhattacharjee. Physica A

5. "Active Brownian particles: mapping to equilibrium polymers and exact computation of moments" Shee, A., Dhar, A. & Chaudhuri, D. Soft Matter 16, 4776–4787 (2020)

6. "Confinement and crowding control the morphology and dynamics of a model bacterial chromosome". Swain, P., Mulder, B. M. & Chaudhuri, D. Soft Matter 15, 2677–2687 (2019).

7. "Impact of crowders on the morphology of bacterial chromosome". Kumar, A., Swain, P., Mulder, B. M. & Chaudhuri, D. EPL (Europhysics Lett.) 128, 68003 (2019).

8. "Higher order topological insulator via periodic driving" by Arnob Kumar Ghosh, Ganesh C. Paul, and Arijit Saha.

9. "Enhancing sensitivity to non-standard neutrino interactions at INO combining muon and hadron information" by Amina Khatun, Sabya Sachi Chatterjee, Tarak Thakore & Sanjib Kumar Agarwalla

10. "Floquet generation of a second-order topological superconductor" (Phys. Rev. B 103, 045424) by Arnob Kumar Ghosh, Tanay Nag and Arijit Saha

11. "Probing the Type-II Seesaw Mechanism through the Production of Higgs Bosons at a Lepton Collider" Pankaj Agrawal, Manimala Mitra, Saurabh Niyogi, Sujay Shil, and Michael Spannowsky (Phys. Rev. D 98 (2018) no. 1, 015024)

12.  "Fat Jet Signature of a Heavy Neutrino at Lepton Collider" Sabyasachi Chakraborty, Manimala Mitra, and Sujay Shil (Phys. Rev. D100 (2019) no.1, 015012)

13.  "Same-Sign Tetralepton Signature at Large Hadron Collider, and future pp Collider" Eung Jin Chun, Sarif Khan, Sanjoy Mandal, Manimala Mitra, and Sujay Shil (Phys. Rev. D101 (2020) no.7, 075008)

14.  "Automated predictions from polarized matrix elements" Diogo Buarque Franzosi, Olivier Mattelaer, Richard Ruiz, and Sujay Shil (JHEP 2004(2020) 082 )

15. "Asymmetric Dark Matter From Triplet Scalar Leptogenesis" Nimmala Narendra, Narendra Sahu, and Sujay Shil

16. "Synthesis and properties of lead-free formamidinium bismuth bromide perovskites" by Manav R. Kar, Mihir R. Sahoo, Saroj K. Nayak and Saikat Bhaumik (Materials Today Chemistry)

17.  "Microscopic study of structure of light- and medium-mass even-even cadmium isotopes" by Shivali Sharma, Rani Devi, and S. K. Khosa

18.  "Minimal and nonminimal universal extra dimension models in the light of LHC data at 13 TeV" by Avnish, Kirtiman Ghosh, Tapoja Jha and Saurabh Niyogi

19.  "Relativistic freeze-in with scalar dark matter in a gauged B − L model and electroweak symmetry breaking" by Priyotosh Bandyopadhyay, Manimala Mitra and Abhishek Roy

20. "Validating the Earth's Core using Atmospheric Neutrinos with ICAL at INO" by Anil Kumar and Sanjib Kumar Agarwalla

21.  "Floquet second order topological superconductor based on unconventional pairing" by Arnob Kumar Ghosh, Tanay Nag, and Arijit Saha (Phys. Rev. B 103, 085413 (2021))

22.  "Hierarchy of higher-order topological superconductors in three dimensions" by Arnob Kumar Ghosh, Tanay Nag, and Arijit Saha (Phys. Rev. B 104, 134508 (2021))

23. "Type-III see-saw: Search for triplet fermions in final states with multiple leptons and fat-jets at 13 TeV LHC" by Saiyad Ashanujjaman, Kirtiman Ghosh (Physics Letters B)

24.  "Revisiting type-II see-saw: present limits and future prospects at LHC" by Saiyad Ashanujjaman and Kirtiman Ghosh (JHEP 03)

25.  "Exploring the Violation of Lorentz Invariance using Atmospheric Neutrinos at INO-ICAL" by Sadashiv Sahoo, Anil Kumar and Sanjib Kumar Agarwalla

26.  "Probing Lorentz Invariance Violation with atmospheric neutrinos at INO-ICAL" by Sadashiv Sahoo, Anil Kumar and Sanjib Kumar Agarwalla

27.  "Systematic generation of the cascade of anomalous dynamical first- and higher-order modes in Floquet topological insulators" by Arnob Kumar Ghosh, Tanay Nag and Arijit Saha

28.  "Dynamical construction of quadrupolar and octupolar topological superconductors" by Arnob Kumar Ghosh, Tanay Nag and Arijit Saha

29.  "A close look on 2-3 mixing angle with DUNE in light of current neutrino oscillation data" by Sanjib Kumar Agarwalla, Ritam Kundu, Suprabh Prakash and Masoom Singh

30. "Search for exotic leptons in final states with two or three leptons and fat-jets at 13 TeV LHC" by Saiyad Ashanujjaman, Debajyoti Choudhury and Kirtiman Ghosh

31. "Model-independent constraints on non-unitary neutrino mixing from high-precision long-baseline experiments" by Sanjib Kumar Agarwalla, Sudipta Das, Alessio Giarnetti, and Davide Meloni

32 .  "Functional Pyromellitic Diimide as a Corrosion Inhibitor for Galvanized Steel: An Atomic-Scale Engineering" by Anoop Kumar Kushwaha, Mihir Ranjan Sahoo, Mausumi Ray, Debashish Das, Suryakanta Nayak, Apurba Maity, Kuntal Sarkar, Amar Nath Bhagat, Atanu Ranjan Pal, Tapan Kumar Rout, and Saroj Kumar Nayak