I implemented a C++ pipeline for learning Fisher feature vectors using VLFeat, since Matlab should be avoided whenever possible. (I only managed to find Python bindings for VLFeat later.)
The code does the following: from a set of labeled images it extracts dense SIFT features. It then finds a PCA representation of the SIFT descriptors, reducing the dimension from 128 to 80 (or whatever dimension you want), and computes a GMM clustering of the reduced descriptors using the EM algorithm. From the learned GMM it computes the Fisher vector representation of each image. We can also append additional features read from a text file, such as the activations of a CNN layer, from Caffe for example. After computing the feature vectors the code trains a linear SVM, one for each class label. All learned representations are saved so that inference can be run on new images.
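To make the flow concrete, here is a rough Python sketch of the same steps built on scikit-learn rather than VLFeat. The variables descriptors_per_image (a list of per-image dense SIFT arrays) and labels are assumed inputs, the number of mixture components is just an example choice, and the Fisher vector is computed by hand instead of with VLFeat's routine, so treat this as an illustration of the steps rather than the actual implementation.

# Python3
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture
from sklearn.svm import LinearSVC

def fisher_vector(x, gmm):
    # Improved Fisher vector (mean and variance parts) of local descriptors x
    T, D = x.shape
    q = gmm.predict_proba(x)                                # (T, K) soft assignments
    mu, var, w = gmm.means_, gmm.covariances_, gmm.weights_
    diff = (x[:, None, :] - mu[None, :, :]) / np.sqrt(var)[None, :, :]
    g_mu = (q[:, :, None] * diff).sum(0) / (T * np.sqrt(w)[:, None])
    g_var = (q[:, :, None] * (diff**2 - 1)).sum(0) / (T * np.sqrt(2 * w)[:, None])
    fv = np.hstack([g_mu.ravel(), g_var.ravel()])
    fv = np.sign(fv) * np.sqrt(np.abs(fv))                  # power normalization
    return fv / (np.linalg.norm(fv) + 1e-12)                # L2 normalization

# PCA of the pooled dense SIFT descriptors, 128 -> 80 dimensions
all_desc = np.vstack(descriptors_per_image)
pca = PCA(n_components=80).fit(all_desc)
# GMM clustering of the reduced descriptors via EM (256 components is an example choice)
gmm = GaussianMixture(n_components=256, covariance_type='diag').fit(pca.transform(all_desc))
# One Fisher vector per image, then a linear SVM per class (one-vs-rest)
X = np.array([fisher_vector(pca.transform(d), gmm) for d in descriptors_per_image])
svm = LinearSVC().fit(X, labels)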
The code uses CMake for compilation and requires the Boost, OpenCV and Eigen libraries in addition to VLFeat. To train on some subset of images we run the training executable from the terminal.
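Building follows the usual out-of-source CMake routine; the training call at the end is purely hypothetical, since the real binary name and flags are whatever the project defines.

mkdir build && cd build
cmake .. && make
# hypothetical invocation -- substitute the actual binary name and options
./fisher_train --images train_list.txt --pca-dim 80 --out model/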
Let's say you have a set of text files containing columnar data, for example a set of csv files. Each row in each of the files contains data entries relating to the same instance. We would like to concatenate all the csv files column-wise into one file. We can do this in the terminal using the paste command, like this,
paste file1.csv file2.csv | column -t > X.txt
If we want to remove the , ; \t or any other character that usually separates csv fields, we can tell column to create a table separated by white space. Suppose the data in our csv files are separated by ; then we can do something like the following.
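(In this sketch -d makes paste join the files with ; and -s tells column which character to split fields on; the exact flags can vary between column implementations, so double-check on your system.)

paste -d';' file1.csv file2.csv | column -t -s';' > X.txt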
Sometimes this generates an error, since the paste command can only handle lines up to 512 characters long (if I remember correctly). A better approach is then to use a small script instead; I found one on StackOverflow.
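That script is not reproduced here, but a few lines of Python can do the same column-wise join without any line-length limit. This is just a sketch; the file names and the ; separator are placeholders for your own data.

# Python3
# Column-wise concatenation of two ;-separated files, no line-length limit
with open('file1.csv') as f1, open('file2.csv') as f2, open('X.txt', 'w') as out:
    for row1, row2 in zip(f1, f2):
        fields = row1.rstrip('\n').split(';') + row2.rstrip('\n').split(';')
        out.write(' '.join(fields) + '\n')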
Let's say you have a list of lists containing values that you want to iterate over. The problem is that the number of lists keeps changing, so you can't really write a hard-coded set of nested for loops. What you need is to create the nested loops dynamically. In Python you can use itertools.product for this kind of job. It is extremely handy if you want to do a grid search over a set of values, for example. In code it might look like this,
# Python3
import numpy as np
from itertools import product

# Create a list of np array values
rangeList = []
rangeList.append(np.linspace(0., 10., 10))
rangeList.append(np.linspace(40., 1000., 10))
rangeList.append(np.linspace(10., 40., 10))

# Iterate over all combinations of the lists' values
for vals in product(*rangeList):
    for val in vals:
        print(val, end=",")
    print(" ")