Reputation: 161
I am working on an application that loads a lot of data with parallel threads into one table.
I'm using Microsoft SQL Server 2014 with one memory-optimized table and a natively compiled stored procedure for writing.
My application reads tons of data (virtual trades) into this memory-optimized table:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[Trades]
(
    [asset_id] [bigint] NOT NULL,
    [trade_id] [bigint] NOT NULL,
    [BNP] [money] NOT NULL,
    [LSP] [money] NULL,
    [expires] [int] NULL,
    [timestamp] [bigint] NULL,
    [currentBid] [money] NULL,
    [startingBid] [money] NULL,
    [pos] [nchar](3) COLLATE Latin1_General_BIN2 NOT NULL,
    [rating] [int] NOT NULL,

    INDEX [asset] NONCLUSTERED HASH
    (
        [asset_id],
        [BNP]
    ) WITH (BUCKET_COUNT = 32),

    CONSTRAINT [Trades_primaryKey] PRIMARY KEY NONCLUSTERED HASH
    (
        [trade_id]
    ) WITH (BUCKET_COUNT = 64)
) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA)
GO
On every call, the stored procedure calculates an average price for the BNP column
across all rows with the same asset_id
to find out whether the current trade is a very cheap one:
CREATE PROCEDURE [dbo].[ADD_TRADE_]
    @asset int,           -- asset_id
    @trade nchar(12),     -- trade_id
    @BNP int,             -- BuyNowPrice
    @LSP int,             -- LastSellPrice
    @expires nchar(16),   -- expiration in seconds
    @timestamp bigint,    -- current timestamp
    @currentBid int,      -- current bid on trade
    @startingBid int,     -- starting bid on trade
    @pos nchar(3),        -- player position
    @rating int,          -- card rating
    @MIN_TRADE_CNT int,
    @BUY_PERCENTAGE int,
    @AVG_BNP int
WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
AS BEGIN ATOMIC WITH
(
    TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english'
)
    -- CONFIG
    /* RATING TO COLOR
    DECLARE @BRONZE int = 64
    DECLARE @SILVER int = 74
    */

    /* 1 //// INSERT FRESH TRADES ////////////////////// */
    -- TMP VARS
    DECLARE @trade_cnt int = 0
    DECLARE @exists bit = 0

    -- CHECK IF TRADE EXISTS
    SELECT @exists = 1
    FROM [dbo].[Trades]
    WHERE trade_id = @trade

    -- ADD IF NOT
    IF @exists = 0
    BEGIN
        INSERT INTO [dbo].[Trades]
            (asset_id, trade_id, BNP, LSP, expires, timestamp, currentBid, startingBid, pos, rating)
        VALUES
            (@asset, @trade, @BNP, @LSP, @expires, @timestamp, @currentBid, @startingBid, @pos, @rating)

        -- CHECK IF TRADE IS A GOOD ONE TO BUY
        DECLARE @MIN_BID_PRICE money = 0
        SELECT @trade_cnt = COUNT(*) FROM dbo.Trades;
        SET @MIN_BID_PRICE = (@AVG_BNP / 100) * @BUY_PERCENTAGE

        -- IF YES, ADD TO BID TABLE
        IF @trade_cnt >= @MIN_TRADE_CNT AND @BNP > 0 AND @BNP < @MIN_BID_PRICE
        BEGIN
            INSERT INTO [dbo].[Bids] (trade_id, bid, max) VALUES (@trade, @BNP, @MIN_BID_PRICE)
        END
    END
END
So it not only writes in parallel, it also reads in parallel.
Now my problem is that some writes get blocked, and I get this error:
A previous transaction that the current transaction took a dependency on has aborted, and the current transaction can no longer commit.
If I handle this error with retry logic, some of these retries still fail up to 10 times, and this doesn't just block one specific worker thread: it builds up a big stack of threads waiting for their turn to write.
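For context, this is roughly what I mean by retry logic (a minimal C# sketch, assuming a .NET client; connectionString, the parameter plumbing, and the back-off values are placeholders - 41301 is the error number that accompanies the message above):
using System;
using System.Data;
using System.Data.SqlClient;
using System.Threading;

static void AddTradeWithRetry(string connectionString)
{
    const int DependencyAborted = 41301; // error number of the message above
    const int MaxRetries = 10;
    var rng = new Random();

    for (int attempt = 0; ; attempt++)
    {
        try
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand("[dbo].[ADD_TRADE_]", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                // cmd.Parameters.AddWithValue("@asset", ...); and so on
                conn.Open();
                cmd.ExecuteNonQuery();
            }
            return; // success
        }
        catch (SqlException ex) when (ex.Number == DependencyAborted && attempt < MaxRetries)
        {
            // short randomized back-off so colliding writers don't
            // collide again immediately on the retry
            Thread.Sleep(rng.Next(5, 50));
        }
    }
}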
I searched and found an MSDN page that describes the isolation levels,
and I think READ COMMITTED
is what I need.
But it seems I can't use that with a memory-optimized table.
What can I change in my code, or should I use an alternative database system for this?
Upvotes: 0
Views: 102
Reputation: 909
How are you getting your data into your stored procedure? If it's via a .NET application, you'd probably be better off creating a DataTable
in memory for all your columns - including the calculated column - and then using the SqlBulkCopy
class to write all of that data in one go. If you've got too much, you can always break it down into batches.
It's a lot more efficient than trying to write one line at a time!
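Something along these lines (a rough sketch; the rows parameter stands in for however you hold your trades in memory, with values in the same column order as dbo.Trades):
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

static void BulkInsertTrades(string connectionString, IEnumerable<object[]> rows)
{
    // one DataColumn per column of dbo.Trades, in table order
    var table = new DataTable();
    table.Columns.Add("asset_id", typeof(long));
    table.Columns.Add("trade_id", typeof(long));
    table.Columns.Add("BNP", typeof(decimal));
    table.Columns.Add("LSP", typeof(decimal));
    table.Columns.Add("expires", typeof(int));
    table.Columns.Add("timestamp", typeof(long));
    table.Columns.Add("currentBid", typeof(decimal));
    table.Columns.Add("startingBid", typeof(decimal));
    table.Columns.Add("pos", typeof(string));
    table.Columns.Add("rating", typeof(int));

    foreach (var row in rows)
        table.Rows.Add(row); // each row: { asset_id, trade_id, BNP, ... }

    using (var bulk = new SqlBulkCopy(connectionString))
    {
        bulk.DestinationTableName = "dbo.Trades";
        bulk.BatchSize = 5000; // break a very large load into batches
        bulk.WriteToServer(table);
    }
}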
Upvotes: 1