Reputation: 386
I have created a web job in Azure of type "run continuously". The web job function is invoked whenever there is a message in a queue, and I am able to trigger it by adding a message to the queue either with Azure Storage Explorer or from an MVC web app.
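For reference, this is roughly how the MVC app adds a message to that queue — a minimal sketch using the Microsoft.WindowsAzure.Storage client; the SchedulerQueue/Enqueue names and the connection-string handling are just for illustration:

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

public static class SchedulerQueue
{
    public static void Enqueue(string connectionString, string messageText)
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
        CloudQueueClient queueClient = account.CreateCloudQueueClient();

        // Same queue name as the [QueueTrigger] attribute in the web job.
        CloudQueue queue = queueClient.GetQueueReference("webjobschedularargs");
        queue.CreateIfNotExists();

        // The message body becomes the "schedularargs" string in ProcessQueueMessage.
        queue.AddMessage(new CloudQueueMessage(messageText));
    }
}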
The web job is a console application that takes around one hour to complete when run from the command line locally. When the web job is invoked it starts successfully, but after some time (around 5-10 minutes) the function's invoke status shows as "Never Finished" in the web job log.
So my questions are the following:
1) Is this problem due to the long-running task?
2) Is this problem due to an error during processing? (But I can run it locally.)
3) If I delete the record added to the database by the web job, I find that the web job starts again. Why?
4) Do I need to delete the message from the queue after the process completes?
Here is the snippet of code that is invoked by the web job:
using System;
using System.Diagnostics;
using System.IO;
using Microsoft.Azure.Jobs;

namespace Scheduler
{
    class Program
    {
        static void Main()
        {
            JobHost host = new JobHost();
            host.RunAndBlock();
        }

        public static void ProcessQueueMessage([QueueTrigger("webjobschedularargs")] string schedularargs,
            [Blob("containername/blobname")] TextWriter writer)
        {
            //writer.WriteLine(inputText);
            // Split the queue message on whitespace to rebuild the command-line arguments.
            string[] args = schedularargs.Split(new char[0]);
            RunProcess(args);
            writer.WriteLine(schedularargs);
        }

        private static void RunProcess(string[] args)
        {
            if (!Initialize())
            {
                // Log issue
                Console.WriteLine("Error!!! Unable to initialize.");
                // Note: waits for a key press - fine locally, but would hang when run as a web job.
                Console.ReadKey(true);
                // We're done
                return;
            }

            #region Run processor
            var options = new Options();
            var timer = new Stopwatch();
            if (CommandLine.Parser.Default.ParseArguments(args, options))
            {
                Console.WriteLine("Processing: ");
                timer.Start();
                if (options.Profiles != null)
                {
                    foreach (var profile in options.Profiles)
                    {
                        Console.Write(profile + ", ");
                    }
                    Console.WriteLine();
                }
                if (options.Reports != null)
                {
                    foreach (var report in options.Reports)
                    {
                        Console.Write(report + ", ");
                    }
                    Console.WriteLine();
                }
                var processor = new Processor(options);
                processor.Start();
            }
            else
            {
                // Log reason why not valid command args
            }
            #endregion

            timer.Stop();
            Console.WriteLine("Total time (ms): " + timer.ElapsedMilliseconds);
            Console.WriteLine("Done!!! Everything went ok.");
            //#if LOCAL
            //    Console.ReadKey(true);
            //#endif
        }

        private static bool Initialize()
        {
            // Set ninject values
            NinjectConfig.Start();
            // TODO: Set automapper values ??
            return true;
        }
    }
}
Edit:
I'm getting the following error in Azure:
[07/10/2014 15:50:52 > 32a9e0: ERR ] Unhandled Exception: Microsoft.WindowsAzure.Storage.StorageException: The remote server returned an error: (404) Not Found. ---> System.Net.WebException: The remote server returned an error: (404) Not Found.
[07/10/2014 15:50:52 > 32a9e0: ERR ] at System.Net.HttpWebRequest.GetResponse()
[07/10/2014 15:50:52 > 32a9e0: ERR ] at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.ExecuteSync[T](RESTCommand`1 cmd, IRetryPolicy policy, OperationContext operationContext)
[07/10/2014 15:50:52 > 32a9e0: ERR ] --- End of inner exception stack trace ---
[07/10/2014 15:50:52 > 32a9e0: ERR ] at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.ExecuteSync[T](RESTCommand`1 cmd, IRetryPolicy policy, OperationContext operationContext)
[07/10/2014 15:50:52 > 32a9e0: ERR ] at Microsoft.WindowsAzure.Storage.Queue.CloudQueue.UpdateMessage(CloudQueueMessage message, TimeSpan visibilityTimeout, MessageUpdateFields updateFields, QueueRequestOptions options, OperationContext operationContext)
[07/10/2014 15:50:52 > 32a9e0: ERR ] at Microsoft.Azure.Jobs.UpdateQueueMessageVisibilityCommand.TryExecute()
[07/10/2014 15:50:52 > 32a9e0: ERR ] at Microsoft.Azure.Jobs.LinearSpeedupTimerCommand.Execute()
[07/10/2014 15:50:52 > 32a9e0: ERR ] at Microsoft.Azure.Jobs.IntervalSeparationTimer.RunTimer(Object state)
[07/10/2014 15:50:52 > 32a9e0: ERR ] at System.Threading.TimerQueueTimer.CallCallbackInContext(Object state)
[07/10/2014 15:50:52 > 32a9e0: ERR ] at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
[07/10/2014 15:50:52 > 32a9e0: ERR ] at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
[07/10/2014 15:50:52 > 32a9e0: ERR ] at System.Threading.TimerQueueTimer.CallCallback()
[07/10/2014 15:50:52 > 32a9e0: ERR ] at System.Threading.TimerQueueTimer.Fire()
[07/10/2014 15:50:52 > 32a9e0: ERR ] at System.Threading.TimerQueue.FireNextTimers()
[07/10/2014 15:50:52 > 32a9e0: ERR ] at System.Threading.TimerQueue.AppDomainTimerCallback()
[07/10/2014 15:50:52 > 32a9e0: INFO] .....................................................................................................................................................
[07/10/2014 15:50:52 > 32a9e0: SYS ERR ] Job failed due to exit code -532462766
[07/10/2014 15:50:52 > 32a9e0: SYS INFO] Process went down, waiting for 0 seconds
[07/10/2014 15:50:52 > 32a9e0: SYS INFO] Status changed to PendingRestart
[07/10/2014 15:50:57 > 32a9e0: SYS INFO] Run script 'Scheduler.exe' with script host - 'WindowsScriptHost'
[07/10/2014 15:50:57 > 32a9e0: SYS INFO] Status changed to Running
So, looking at the error, I tried executing the same function from a console app locally and it works. After running it multiple times, I found that the function executes for exactly 5 minutes. So is there any time limit on running a function in WebJobs?
Thanks.
Upvotes: 3
Views: 3444
Reputation: 3037
I found out the problem can arise if, in a function call, unicode characters are passed to the logger object. Apparently unicode isn't supported; something in the cleanup process goes wrong and the job is never closed.
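As a possible workaround, non-ASCII characters could be stripped before the text reaches the logger. A minimal sketch (the SanitizeForLog helper is purely illustrative, not part of the SDK):

using System.Linq;

static string SanitizeForLog(string input)
{
    if (string.IsNullOrEmpty(input))
        return input;

    // Keep only 7-bit ASCII characters; anything outside that range
    // is replaced with '?' so the logger never sees unicode.
    return new string(input.Select(c => c <= 127 ? c : '?').ToArray());
}

// Usage inside the web job function:
// writer.WriteLine(SanitizeForLog(schedularargs));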
Upvotes: 1
Reputation: 311
There seems to be a bug in the WebJobs SDK where, depending on when and where the storage accounts are created, it does not create the blob containers that should exist by default. What I had to do was create the containers myself.
They are:
azure-jobs-host-output
azure-webjobs-hosts
That resolved the problem for me.
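If you'd rather create them from code than through Storage Explorer, something along these lines should work with the Microsoft.WindowsAzure.Storage client (the EnsureWebJobsContainers helper and connection-string handling are just illustrative):

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

static void EnsureWebJobsContainers(string connectionString)
{
    CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
    CloudBlobClient blobClient = account.CreateCloudBlobClient();

    // Containers the WebJobs SDK expects to exist.
    foreach (string name in new[] { "azure-jobs-host-output", "azure-webjobs-hosts" })
    {
        CloudBlobContainer container = blobClient.GetContainerReference(name);
        container.CreateIfNotExists();
    }
}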
Upvotes: 3