Dringo

Reputation: 265

Computing edit distance of DNA sequences in Python

So I have been given the task of finding the lowest-cost alignment between two DNA sequences. One of the failing inputs is:

CGCAATTCTGAAGCGCTGGGGAAGACGGGT & TATCCCATCGAACGCCTATTCTAGGAT 

The proper alignment costs 24, but I am getting a cost of 23.

I have to read in the cost of switching bases, say A -> T, G -> C, and so on. The cost file I have been given is:

*,-,A,T,G,C
-,0,1,2,1,3
A,1,0,1,5,1
T,2,1,0,9,1
G,1,5,9,0,1
C,3,1,1,1,0

I have made a Python dictionary that maps every pair of bases to its cost; it looks like:

{'AT': '1', '-C': '3', 'TG': '9', '-G': '1', 'AC': '1', 'C-': '3', 'CA': '1', 'AA': '0', '--': '0', 'TA': '1', 'T-': '2', 'CG': '1', '-T': '2', 'CC': '0', 'GG': '0', 'A-': '1', 'CT': '1', 'AG': '5', 'GC': '1', 'GT': '9', 'GA': '5', 'G-': '1', '-A': '1', 'TC': '1', 'TT': '0'}
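
For reference, this dictionary can be built from the cost file with something along these lines (just a sketch; the csv parsing and the path are illustrative, not my exact code):

import csv

def load_costs(path):
    # Parse the cost file shown above (header row "*,-,A,T,G,C") into a
    # {pair: cost} dictionary, keeping the costs as strings as in the dict above.
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    header = rows[0][1:]                      # ['-', 'A', 'T', 'G', 'C']
    costs = {}
    for row in rows[1:]:
        for col_base, cost in zip(header, row[1:]):
            costs[row[0] + col_base] = cost   # e.g. costs['AT'] = '1'
    return costs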

Currently my implementation works for some cases, and in other cases I am off by +/- 1.

Here is a snippet of my code:

def align(one_Line, costBook):
    Split_Line = one_Line.split(",")
    array = [[0] * len(Split_Line[1]) for i in Split_Line[0]]  # Zero fill array,

    xLine = Split_Line[0]
    yLine = Split_Line[1]

    for i in range(1, len(xLine)):
        array[i][0] = array[i - 1][0] + int(costBook['-' + xLine[i - 1]])
    for i in range(1, len(yLine)):
        array[0][i] = array[0][i - 1] + int(costBook[yLine[i - 1] + '-'])

    for i in range(1, len(xLine)):
        for j in range(1, len(yLine)):
            array[i][j] = min(array[i - 1][j] + diff('-', xLine[i], costBook), 
                              array[i][j - 1] + diff(yLine[j], '-', costBook), 
                              array[i - 1][j - 1] + diff(yLine[j], xLine[i], costBook))

Where the diff function is:

def diff(x, y, cost):
    if x == y:
        return 0
    keyStr = x + y
    return int(cost[keyStr])

Where am I going wrong? Is it in the actual array filling itself, or am I doing the base cases wrong?

EDIT: Here is a semi-working version; at least the edit cost is right:

AGTTGTGAAAGAACAAGCGCACAATATTGCCGCGCCGAAAGCT,TTCTTTCATTATTCAAATGTATAGTTTAGAGCGTTAA

Upvotes: 4

Views: 1822

Answers (1)

nbryans

Reputation: 1527

I believe the reason your algorithm is failing is that you are not accounting for the "gap" row in your array (scoring matrix).

Consider two sequences A and B, each of length n. If we look at the Wikipedia articles for the Needleman-Wunsch and Smith-Waterman algorithms, we see that their respective "scoring" matrices have an extra row at the beginning. This represents the first character of A or B being paired with a gap. I recommend you quickly review those pages to see what I mean.

When you define your array as:

array = [[0] * len(Split_Line[1]) for i in Split_Line[0]]

You are not including this additional row.
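
To see the effect concretely, here is a toy illustration of the base cases once the extra row and column are there (assuming costBook is the dictionary from your question; the sequences are made up):

# Toy illustration with x = "AG" and y = "T": the matrix must be 3 x 2,
# i.e. (len(x)+1) x (len(y)+1), and row 0 / column 0 accumulate pure gap costs.
x, y = "AG", "T"
array = [[0] * (len(y) + 1) for _ in range(len(x) + 1)]
array[1][0] = array[0][0] + int(costBook['-A'])  # 0 + 1 = 1
array[2][0] = array[1][0] + int(costBook['-G'])  # 1 + 1 = 2
array[0][1] = array[0][0] + int(costBook['T-'])  # 0 + 2 = 2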

You would need to modify the align function to add this extra row and alter the logic for calculating the score, e.g.:

def align(one_Line, costBook):
    Split_Line = one_Line.split(",")

    # Defining an array with an extra row and col to represent a leading gap
    array = [[0] * (len(Split_Line[1])+1) for i in range(len(Split_Line[0])+1)]  # Zero fill array,

    xLine = Split_Line[0]
    yLine = Split_Line[1]

    # Changed so we include our extra line in the loop
    for i in range(1, len(xLine)+1):
        array[i][0] = array[i - 1][0] + int(costBook['-' + xLine[i - 1]])
    for i in range(1, len(yLine)+1):
        array[0][i] = array[0][i - 1] + int(costBook[yLine[i - 1] + '-'])

    # Changed so we include our extra row/col in the loop
    for i in range(1, len(xLine)+1):
        for j in range(1, len(yLine)+1):
            # The references to the original string now need -1 (i.e. i-1)
            array[i][j] = min(array[i - 1][j] + diff('-', xLine[i-1], costBook), 
                              array[i][j - 1] + diff(yLine[j-1], '-', costBook), 
                              array[i - 1][j - 1] + diff(yLine[j-1], xLine[i-1], costBook))
    return array
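
You could then check it against your failing input like this (a sketch; costBook is the dictionary from your question, and the total cost ends up in the bottom-right cell):

# Run align on the failing input and read the total cost from the
# bottom-right cell of the returned matrix.
line = "CGCAATTCTGAAGCGCTGGGGAAGACGGGT,TATCCCATCGAACGCCTATTCTAGGAT"
table = align(line, costBook)
print(table[-1][-1])   # should now come out as the expected 24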

Upvotes: 4
