ZedZip

Reputation: 6456

MS Text Analytics Cognitive Service: how to work with local database?

Microsoft provides a service to analyze text data called Text Analytics Cognitive Service.

Is it possible to use this service with a local database, i.e. not in Azure?

I work with some large databases, and it would be interesting to use the service on them for: language detection, key phrase extraction, named entity recognition, and sentiment analysis.

Upvotes: 1

Views: 65

Answers (1)

Md Farid Uddin Kiron

Reputation: 22495

Once you have fetched from your local database the text whose language you want to detect, just pass it to the method below. The response will contain the analysis of your value.

API Access Keys:

        private static readonly string endpointKey = "YourEndpointKey";
        private static readonly string endpoint = "https://YourServiceURL.cognitiveservices.azure.com/text/analytics/v2.1/languages";

Code Snippet:

    public async Task<object> DetectLanguageAsync(string inputFromDbOrUser)
    {
        try
        {
            DetectedLanguageResponseModel detectedLanguageResponse = new DetectedLanguageResponseModel();

            // Create the language detection request parameters
            RequestModel requestModel = new RequestModel();
            requestModel.id = "1";
            requestModel.text = inputFromDbOrUser;

            // Build the document list
            List<RequestModel> documents = new List<RequestModel>();
            documents.Add(requestModel);

            // Bind the request model
            LanguageDetection requestList = new LanguageDetection();
            requestList.documents = documents;

            // Serialize the request object
            var serializedObject = JsonConvert.SerializeObject(requestList);

            // Call the Language Detection API
            using (var client = new HttpClient())
            using (var request = new HttpRequestMessage())
            {
                request.Method = HttpMethod.Post;
                request.RequestUri = new Uri(endpoint);
                request.Content = new StringContent(serializedObject, Encoding.UTF8, "application/json");
                request.Headers.Add("Ocp-Apim-Subscription-Key", endpointKey);

                var response = await client.SendAsync(request);

                // Check the status code and retrieve the response
                if (response.IsSuccessStatusCode)
                {
                    ResponseModel responseModel = JsonConvert.DeserializeObject<ResponseModel>(await response.Content.ReadAsStringAsync());

                    // Return the first detected language, or a fallback message if the response is empty
                    if (responseModel.documents != null && responseModel.documents.Count > 0)
                    {
                        detectedLanguageResponse.Language = responseModel.documents[0].detectedLanguages[0].name;
                        return detectedLanguageResponse;
                    }

                    return "Sorry, I am not able to find a related topic! Would you like me to Bing Search?";
                }
                else
                {
                    // Return the raw error body from the service
                    return await response.Content.ReadAsStringAsync();
                }
            }
        }
        catch (Exception ex)
        {
            // Wrap and rethrow, preserving the original exception as the inner exception
            throw new InvalidOperationException(ex.Message, ex);
        }
    }

Class Used:

    public class DetectedLanguage
    {
        public string name { get; set; }
        public string iso6391Name { get; set; }
    }

    public class DetectedLanguageResponseModel
    {
        public dynamic Language { get; set; }
    }

    public class LanguageDetection
    {
        public List<RequestModel> documents { get; set; }
    }

    public class RequestModel
    {
        public string id { get; set; }
        public string text { get; set; }
    }

    public class ResponseDocument
    {
        public string id { get; set; }
        public List<DetectedLanguage> detectedLanguages { get; set; }
    }

    public class ResponseModel
    {
        public List<ResponseDocument> documents { get; set; }
        public List<object> errors { get; set; }
    }
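To connect this to a local database, as the question asks, you can stream rows out of your own store and pass each text value to the `DetectLanguageAsync` method above. Below is a minimal sketch using `System.Data.SqlClient`; the connection string, table name (`Articles`), and column names (`Id`, `Body`) are placeholders for your own schema, and it assumes the `DetectLanguageAsync` method from this answer is accessible in scope.

```csharp
using System;
using System.Data.SqlClient;
using System.Threading.Tasks;

public async Task DetectForLocalRowsAsync()
{
    // Placeholder connection string and schema -- replace with your own
    var connectionString = "Server=localhost;Database=MyDb;Integrated Security=true;";

    using (var connection = new SqlConnection(connectionString))
    {
        await connection.OpenAsync();

        using (var command = new SqlCommand("SELECT Id, Body FROM Articles", connection))
        using (var reader = await command.ExecuteReaderAsync())
        {
            while (await reader.ReadAsync())
            {
                var id = reader.GetInt32(0);
                var text = reader.GetString(1);

                // Send each local row's text to the Text Analytics endpoint
                var result = await DetectLanguageAsync(text);
                Console.WriteLine($"Row {id}: {result}");
            }
        }
    }
}
```

So the database itself stays local; only the text you want analyzed is sent to the Azure endpoint per request.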

Note: The current limit is 5,120 characters per document; if you need to analyze larger documents, you can break them up into smaller chunks. For more details you can refer to the official documentation.
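The chunking mentioned in the note can be done with a simple splitter. This is an illustrative sketch (the 5,120 figure comes from the note above; the `ChunkText` helper name is my own):

```csharp
using System;
using System.Collections.Generic;

public static class TextChunker
{
    // Split a long document into pieces no longer than maxLength characters
    public static List<string> ChunkText(string text, int maxLength = 5120)
    {
        var chunks = new List<string>();
        for (int offset = 0; offset < text.Length; offset += maxLength)
        {
            int length = Math.Min(maxLength, text.Length - offset);
            chunks.Add(text.Substring(offset, length));
        }
        return chunks;
    }
}
```

Each chunk can then be passed to `DetectLanguageAsync` individually. A smarter splitter would break on sentence boundaries so that a word is not cut in half at a chunk edge.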

Hope that helps. If you need more implementation assistance, please have a look here.

Upvotes: 1
