Reputation: 23
I'm reading through this problem in a book:
Given a string that contains both letters and numbers, write a function that pulls out all the numbers, then returns their sum. Sample input and output:
The string “a1b2c3” should return 6 (1 + 2 + 3). The string “a10b20c30” should return 60 (10 + 20 + 30). The string “h8ers” should return “8”.
My solution so far is:
import Foundation

func sumOfNumbers(in string: String) -> Int {
    var numbers = string.filter { $0.isNumber }
    var numbersArray = [Int]()
    for number in numbers {
        numbersArray.append(Int(number)!)
    }
    return numbersArray.reduce(0, { $0 * $1 })
}
However, I get the error:
Solution.swift:8:33: error: cannot convert value of type 'String.Element' (aka 'Character') to expected argument type 'String'
numbersArray.append(Int(number)!)
^
And I'm struggling to get this number of type String.Element into a Character. Any guidance would be appreciated.
Upvotes: 0
Views: 592
Reputation: 271355
The error occurs because Int.init expects a String, but the argument number you gave it is of type Character.
It is easy to fix the compiler error just by converting the Character to a String:
numbersArray.append(Int("\(number)")!)
or just:
numbersArray.append(number.wholeNumberValue!)
However, this does not produce the expected output. First, you are multiplying the numbers together rather than adding them (and since the reduce starts at 0, the product will always be 0). Second, you are treating each character as a separate number, so a group of digits like "10" is never combined into a single value.
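As a quick illustration, here is roughly what the per-character approach produces even once the conversion compiles and the reduce uses + (just a sketch; sumOfDigits is a throwaway name for comparison):

func sumOfDigits(in string: String) -> Int {
    string.filter { $0.isNumber }               // keeps only the digit characters
        .compactMap { $0.wholeNumberValue }     // each digit becomes its own Int
        .reduce(0, +)
}

sumOfDigits(in: "a1b2c3")    // 6 — happens to match the expected output
sumOfDigits(in: "a10b20c30") // 6, not 60 — "10" contributes 1 and 0 separately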
You can instead implement the function like this:
func sumOfNumbers(in string: String) -> Int {
    string.components(separatedBy: CharacterSet(charactersIn: "0"..."9").inverted)
        .compactMap(Int.init)
        .reduce(0, +)
}
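If you want to sanity-check it against the book's examples, something like this should work in a playground (assuming Foundation is imported, since CharacterSet comes from it):

sumOfNumbers(in: "a1b2c3")    // 6
sumOfNumbers(in: "a10b20c30") // 60
sumOfNumbers(in: "h8ers")     // 8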
The key thing is to split the string on "non-digit" characters, so that "10", "20", etc. get treated as whole numbers rather than single digits.
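For example, here is roughly what each step produces for "a10b20c30" (a sketch with illustrative variable names):

import Foundation

let digits = CharacterSet(charactersIn: "0"..."9")
let parts = "a10b20c30".components(separatedBy: digits.inverted)
// parts == ["", "10", "20", "30"] — the leading "a" leaves an empty component
let numbers = parts.compactMap(Int.init)   // [10, 20, 30]; "" maps to nil and is dropped
let total = numbers.reduce(0, +)           // 60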
Upvotes: 1