Reputation: 13
I have an Excel sheet that contains 35,000 URLs for product images. I'm looking for a way to download the images from those URLs so that I can then insert them into SQL Server as images. I have no idea how to do that and need help.
Any ideas?
Upvotes: 0
Views: 435
Reputation: 754368
I would try to do something like this in a standalone program - e.g. a command-line utility or something. I coded this in C#, and for whatever reason, the online C#-to-VB.NET converters all barfed and couldn't convert it - I hope you get the basic idea and can do this in VB.NET yourself.
First step: get the ExcelDataReader to read Excel files.
Then do something like this:
// define list of URLs
List<string> imageUrls = new List<string>();

// open Excel file and read the URLs into the list of strings
string filePath = @"C:\YourUrlDataFile.xlsx"; // adapt to YOUR needs!

// using a "FileStream" and the "ExcelDataReader", read all the URLs
// into a list of strings
using (FileStream stream = File.Open(filePath, FileMode.Open, FileAccess.Read))
{
    using (IExcelDataReader excelReader = ExcelReaderFactory.CreateOpenXmlReader(stream))
    {
        while (excelReader.Read())
        {
            string url = excelReader.GetString(0);

            // skip empty cells
            if (!string.IsNullOrWhiteSpace(url))
            {
                imageUrls.Add(url);
            }
        }
    }
}

// set up the necessary infrastructure for storing into SQL Server
// the query needs to be *ADAPTED* to your own situation - use *YOUR*
// table and column names!
string query = "INSERT INTO dbo.TestImages(ImageData) VALUES(@Image);";

// get the connection string from config - again: *ADAPT* to your situation!
string connectionString = ConfigurationManager.ConnectionStrings["YourDatabase"].ConnectionString;

// use SqlConnection and SqlCommand in using blocks
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(query, conn))
{
    // add parameter to SQL query
    cmd.Parameters.Add("@Image", SqlDbType.VarBinary, -1);

    // open the connection once, up front - no need to open/close it for every row
    conn.Open();

    // loop through the URLs - try to fetch each image,
    // and if successful, insert it into the SQL Server database
    foreach (string url in imageUrls)
    {
        try
        {
            // get a new "WebClient" (disposed after each download), and fetch the data from the URL
            using (WebClient client = new WebClient())
            {
                byte[] imageData = client.DownloadData(url);

                // set the parameter to the data fetched from the URL
                cmd.Parameters["@Image"].Value = imageData;

                // execute SQL query - the return value is the number
                // of rows inserted - should be *1* (if successful)
                int inserted = cmd.ExecuteNonQuery();
            }
        }
        catch (Exception exc)
        {
            // log the exception and continue with the next URL
        }
    }

    // close connection
    conn.Close();
}
This should do pretty much what you need. Of course, there are plenty of additional things you could do - read only a certain number of URLs from the Excel file, add more logging (also for success cases etc.) - but that should be the rough skeleton of this little program.
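Since the automatic converters failed, here is a rough hand translation of the same skeleton into VB.NET. This is only a sketch under the same assumptions as the C# version - the file path, the `dbo.TestImages(ImageData)` table/column, and the `YourDatabase` connection string name are all placeholders you must adapt:

```vb
Imports System.Collections.Generic
Imports System.Configuration
Imports System.Data
Imports System.Data.SqlClient
Imports System.IO
Imports System.Net
Imports ExcelDataReader

Module ImageImporter
    Sub Main()
        ' read the URLs from the Excel file into a list of strings
        Dim imageUrls As New List(Of String)()
        Dim filePath As String = "C:\YourUrlDataFile.xlsx" ' adapt to YOUR needs!

        Using stream As FileStream = File.Open(filePath, FileMode.Open, FileAccess.Read)
            Using excelReader As IExcelDataReader = ExcelReaderFactory.CreateOpenXmlReader(stream)
                While excelReader.Read()
                    Dim url As String = excelReader.GetString(0)
                    If Not String.IsNullOrWhiteSpace(url) Then
                        imageUrls.Add(url)
                    End If
                End While
            End Using
        End Using

        ' adapt table and column names to YOUR situation!
        Dim query As String = "INSERT INTO dbo.TestImages(ImageData) VALUES(@Image);"
        Dim connectionString As String = ConfigurationManager.ConnectionStrings("YourDatabase").ConnectionString

        Using conn As New SqlConnection(connectionString)
            Using cmd As New SqlCommand(query, conn)
                cmd.Parameters.Add("@Image", SqlDbType.VarBinary, -1)
                conn.Open()

                ' fetch each image and insert it into the database
                For Each url As String In imageUrls
                    Try
                        Using client As New WebClient()
                            Dim imageData As Byte() = client.DownloadData(url)
                            cmd.Parameters("@Image").Value = imageData
                            Dim inserted As Integer = cmd.ExecuteNonQuery()
                        End Using
                    Catch ex As Exception
                        ' log the exception and continue with the next URL
                    End Try
                Next

                conn.Close()
            End Using
        End Using
    End Sub
End Module
```

It references the same ExcelDataReader package and `System.Data.SqlClient` as the C# version, so the project needs those same references.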
Upvotes: 1