Reputation: 126
I am attempting to set up our TFS 2015 instance to use Release Management to build and deploy. Our TFS environment is set up as follows:
We installed the Release Management agent on TfsBuildAgent, configured to run as a service, so we have one agent each for build and release. Our builds work fine, but our deployments fail instantly. The log for the release is empty and indicates only that the release failed. On TfsBuildAgent I located the diagnostic log for the agent we installed, and it contains this exception:
23:46:15.012149 Sending trace output to log files: C:\Install Files\agent\_diag
23:46:15.012149 vsoWorker.exe was run with the following command line:
"C:\Install Files\agent\agent\worker\vsoWorker.exe" /name:Worker-64eb077e-efad-4092-a396-ae8a3854583c /id:64eb077e-efad-4092-a396-ae8a3854583c /rootFolder:"C:\Install Files\agent" /logger:Forwarding,1.0.0;Verbosity=Diagnostic,Name=Agent2-965ab608b756d5140aba25dadafb30d7;JobId=64eb077e-efad-4092-a396-ae8a3854583c
23:46:15.012149 VsoWorker.Main(): Create AgentLogger
23:46:15.012149 VsoWorker.Main(): Parse command line
23:46:15.027765 VsoWorker.Main(): Setup Agent
23:46:15.027765 VsoWorker.LoadSettings()
23:46:15.121514 SettingsFileHelper.Load - settings[AutoUpdate]=True
23:46:15.121514 SettingsFileHelper.Load - settings[RootFolder]=C:\Install Files\agent
23:46:15.121514 SettingsFileHelper.Load - settings[WorkFolder]=E:\Agent
23:46:15.121514 SettingsFileHelper.Load - settings[ServerUrl]=http://[OurTfsServer]/tfs
23:46:15.121514 SettingsFileHelper.Load - settings[AgentName]=Agent-[OurBuildAgent]
23:46:15.121514 SettingsFileHelper.Load - settings[PoolId]=2
23:46:15.121514 SettingsFileHelper.Load - settings[PoolName]=Dev Build Pool
23:46:15.121514 SettingsFileHelper.Load - settings[AgentId]=9
23:46:15.121514 SettingsFileHelper.Load - settings[RunAsWindowsService]=True
23:46:15.121514 SettingsFileHelper.Load - settings[WindowsServiceName]=vsoagent.[OurTfsServer].Agent-[OurBuildAgent]
23:46:15.121514 SettingsFileHelper.Load - settings[WindowsServiceDisplayName]=VSO Agent ([OurTfsServer]Agent-[OurBuildAgent])
23:46:15.121514 AgentWorkerIPCListener(jobId = 64eb077e-efad-4092-a396-ae8a3854583c)
23:46:15.121514 AgentWorkerIPCListener.Pipename = net.pipe://localhost/64eb077e-efad-4092-a396-ae8a3854583c
23:46:15.137140 VsoWorker.Main(): Run Agent
23:46:15.184015 AgentWorkerIPCListener.Listen() - opening host
23:46:15.355902 AgentWorkerIPCListener.Listen() - waiting for shutdown
23:46:15.480890 AgentWorkerIPCService.Connect
23:46:15.496515 AgentWorkerIPCService.StartJob - firing StartJob event
23:46:15.496515 AgentWorkerIPCService.Disconnect
23:46:16.246515 ForwardingWriter.Initialize(6)
23:46:16.246515 ForwardingWriter.Initialize() - namedpipe = net.pipe://localhost/Agent2-965ab608b756d5140aba25dadafb30d7
23:46:16.371513 System.ServiceModel.EndpointNotFoundException: There was no endpoint listening at net.pipe://localhost/Agent2-965ab608b756d5140aba25dadafb30d7 that could accept the message. This is often caused by an incorrect address or SOAP action. See InnerException, if present, for more details. ---> System.IO.PipeException: The pipe endpoint 'net.pipe://localhost/Agent2-965ab608b756d5140aba25dadafb30d7' could not be found on your local machine.
--- End of inner exception stack trace ---
Server stack trace:
at System.ServiceModel.Channels.PipeConnectionInitiator.GetPipeName(Uri uri, IPipeTransportFactorySettings transportFactorySettings)
at System.ServiceModel.Channels.NamedPipeConnectionPoolRegistry.NamedPipeConnectionPool.GetPoolKey(EndpointAddress address, Uri via)
at System.ServiceModel.Channels.CommunicationPool`2.TakeConnection(EndpointAddress address, Uri via, TimeSpan timeout, TKey& key)
at System.ServiceModel.Channels.ConnectionPoolHelper.EstablishConnection(TimeSpan timeout)
at System.ServiceModel.Channels.ClientFramingDuplexSessionChannel.OnOpen(TimeSpan timeout)
at System.ServiceModel.Channels.CommunicationObject.Open(TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannel.OnOpen(TimeSpan timeout)
at System.ServiceModel.Channels.CommunicationObject.Open(TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannel.CallOpenOnce.System.ServiceModel.Channels.ServiceChannel.ICallOnce.Call(ServiceChannel channel, TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannel.CallOnceManager.CallOnce(TimeSpan timeout, CallOnceManager cascade)
at System.ServiceModel.Channels.ServiceChannel.EnsureOpened(TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs, TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannelProxy.InvokeService(IMethodCallMessage methodCall, ProxyOperationRuntime operation)
at System.ServiceModel.Channels.ServiceChannelProxy.Invoke(IMessage message)
Exception rethrown at [0]:
at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
at Microsoft.TeamFoundation.DistributedTask.Agent.Logger.INetPipeWriter.Connect(Guid jobId)
at Microsoft.TeamFoundation.DistributedTask.Agent.Logger.ForwardingWriter.Initialize(Dictionary`2 writerParameters)
at Microsoft.TeamFoundation.DistributedTask.Agent.Worker.Common.LoggerHelper.LoadLogWriter(String rootFolder, String typeName, String version, Dictionary`2 writerParameters)
at Microsoft.TeamFoundation.DistributedTask.Worker.Worker.Run()
23:46:16.371513 AgentWorkerIPCListener.Listen() - closing
23:46:16.418389 AgentWorkerIPCListener.Listen() - closed
23:46:16.418389 BaseLogger.Dispose()
Why am I getting a localhost error for something that should be communicating across multiple servers? Is this endpoint configured somewhere that I can change? (I have looked and cannot find any configuration that would control it.) Based on our research, I suspect the pipe it is trying to find belongs to the TFS logging service, but I cannot find where that is defined either.
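From what I can tell, the net.pipe addresses in the log are WCF named-pipe endpoints, which are machine-local IPC: the agent process and its worker process talk over them on the same box, so "localhost" may be expected even in a multi-server TFS setup. Below is a minimal sketch of that pattern (the contract, class, and address names are made up for illustration and are not the agent's actual interface); if the host side is never opened, the client throws the same EndpointNotFoundException seen above.

// Minimal WCF net.pipe sketch (hypothetical names, not the agent's real contract).
using System;
using System.ServiceModel;

[ServiceContract]
public interface IPipeDemo
{
    [OperationContract]
    string Ping(string message);
}

public class PipeDemo : IPipeDemo
{
    public string Ping(string message) => "pong: " + message;
}

public static class Program
{
    public static void Main()
    {
        // Host side: listen on a machine-local named pipe, analogous to
        // AgentWorkerIPCListener opening net.pipe://localhost/<jobId>.
        var address = new Uri("net.pipe://localhost/pipe-demo");
        using (var host = new ServiceHost(typeof(PipeDemo)))
        {
            host.AddServiceEndpoint(typeof(IPipeDemo), new NetNamedPipeBinding(), address);
            host.Open();

            // Client side: if no process on *this* machine has opened the pipe,
            // the call throws EndpointNotFoundException, like the ForwardingWriter
            // failure in the worker log.
            var factory = new ChannelFactory<IPipeDemo>(
                new NetNamedPipeBinding(), new EndpointAddress(address));
            IPipeDemo proxy = factory.CreateChannel();
            Console.WriteLine(proxy.Ping("hello"));
            factory.Close();
        }
    }
}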
Upvotes: 1
Views: 186
Reputation: 126
Upon further review, the problem was that the agent we had installed for releases was either corrupted or misconfigured. Reinstalling the agent solved the problem and allowed the release to progress through its actual deployment steps. As a troubleshooting step, we isolated the problem to that agent by switching the release to use an agent we knew was in good working order, since it had been handling the build portion of the process.
Upvotes: 1