How to make AutoAligner work in OSX

I’m using AutoAligner (GitHub) to align code in Sublime Text. However, after installing it, the default key binding (ctrl+k, ctrl+a) doesn’t work.

It turns out that the repository doesn’t include the file `Default (OSX).sublime-keymap`.

You can add this file as follows:

  1. Preferences -> Browse Packages
  2. Duplicate the file `Default (Linux).sublime-keymap` and rename the copy to `Default (OSX).sublime-keymap`
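The duplicated file should keep the same binding the Linux keymap already has. A minimal sketch of what the resulting `Default (OSX).sublime-keymap` might look like — the command name `auto_align` here is an assumption, so copy the real entry from the Linux file rather than typing this in:

```json
[
    // Assumed command name — use whatever Default (Linux).sublime-keymap contains
    { "keys": ["ctrl+k", "ctrl+a"], "command": "auto_align" }
]
```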


For example, take these two lines:
book = 'A more beautiful question'
author = 'Warren Berger'

Select those two lines in Sublime, then press ctrl+k followed by ctrl+a:

book   = 'A more beautiful question'
author = 'Warren Berger'

Missing “=” sign

I was struggling with this piece of code. I had a similar piece of code that ran perfectly, so I went back and forth, gradually deleting pieces of code to compare the two.

It turns out that the mistake was really, really small, as shown in line 38.


The problem I got:

Name: org.apache.spark.SparkException
Message: Job aborted due to stage failure: Task 0 in stage 365.0 failed 4 times, most recent failure: Lost task 0.3 in stage 365.0 (TID 12524, java.lang.ClassCastException


Hive on Spark is not working

Problem: in the Hive CLI, a simple query doesn’t return a result.

Solution: make sure you have at least one worker (or slave) registered with the Spark master.

hive> select count(*) from subset1_data_stream_with_cgi;

Status: Running (Hive on Spark job[0])
Job Progress Format
CurrentTime StageId_StageAttemptId: SucceededTasksCount(+RunningTasksCount-FailedTasksCount)/TotalTasksCount [StageCost]
2016-06-30 15:09:54,526    Stage-0_0: 0/1    Stage-1_0: 0/1
2016-06-30 15:09:57,545    Stage-0_0: 0/1    Stage-1_0: 0/1
2016-06-30 15:10:00,561    Stage-0_0: 0/1    Stage-1_0: 0/1
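When the job hangs at 0/1 like this, a quick check is the Spark master web UI (port 8080 by default): it should list at least one registered worker. Starting one can be sketched as below, assuming a standalone Spark cluster with `$SPARK_HOME` set and the master on its default port 7077 (`master-host` is a placeholder for your actual master hostname):

```shell
# Start a worker (called a "slave" in older Spark releases) and point it at the master
$SPARK_HOME/sbin/start-slave.sh spark://master-host:7077
```

Once the worker shows up in the master UI, re-run the Hive query and the stages should start making progress.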


Hive CLI doesn’t start

Problem: the Hive CLI shut down suddenly, and I couldn’t start it again.

Error message:

java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=/mnt/storage/DATA/hadoop/metastore_db;create=true, username = APP

Diagnosis: since Derby allows only one connection to its database, it creates *.lck files in the databaseName folder above. Go to this folder and delete those *.lck files.

After I deleted dbex.lck and db.lck, Hive started as usual.
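The cleanup can be sketched as the following shell session. The temporary directory here stands in for the real metastore path (`/mnt/storage/DATA/hadoop/metastore_db` in the JDBC URL above), and you should make sure no Hive or Derby process is still running before deleting the locks:

```shell
# Temporary directory standing in for the real metastore_db folder
METASTORE_DB=$(mktemp -d)

# Simulate the stale lock files Derby leaves behind after a crash
touch "$METASTORE_DB/db.lck" "$METASTORE_DB/dbex.lck"

# With Hive/Derby stopped, remove the lock files so a new connection can open
rm -f "$METASTORE_DB"/*.lck
```

On the real path, the same `rm -f .../*.lck` is all that is needed; Derby recreates the locks on the next successful connection.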