It's natural to doubt yourself when learning to code - share your feelings and get moral support here
Julian, a few syntax changes and a little refactoring will get you there. Look at this refactor.
function addTogether(arr) {
  return arr[0] + arr[1];
}
addTogether([5, 7]);
// returns 12
But we can do better still. Your function addTogether() only works for an array of length === 2. This isn't very useful. It would be better if we could add the values in an array of any length. There are several ways to do this:
function addAnyArray(arr) {
  let sum = 0;
  for (let i = 0; i < arr.length; i++) {
    sum += arr[i];
  }
  return sum;
}
addAnyArray([1, 2, 3, 4]);
// This will return 10
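Another of those ways: ES6 rest parameters let the caller pass the numbers directly, without wrapping them in an array first. (The name addAll here is just for illustration.)

```javascript
// rest parameters collect every argument into a real array
function addAll(...numbers) {
  let sum = 0;
  for (const n of numbers) {
    sum += n;
  }
  return sum;
}

addAll(1, 2, 3, 4);
// This will return 10
```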
Using ES6, we can get a bit fancier, and terser (but these methods are a little harder to read). Here is an example:
[1, 2, 3, 4].reduce((sum, value) => sum + value);
// This will return 10
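One caveat with reduce: called without an initial value, it throws a TypeError on an empty array, so it's usually worth passing 0 explicitly:

```javascript
// the second argument (0) seeds the accumulator, so [] is handled too
const sum = [1, 2, 3, 4].reduce((total, value) => total + value, 0);
console.log(sum);
// This will return 10

[].reduce((total, value) => total + value, 0);
// This will return 0
```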
Hope this helps!
[ELIXIR QUESTION]
Hey guys, I need some help. I wrote a script for web scraping.
The source:
defmodule Shopee do
  @behaviour Crawly.Spider

  @doc """
  Execute by
  Crawly.Engine.start_spider({defmodule_name})
  """

  @impl Crawly.Spider
  def base_url() do
    "https://shopee.com.my"
  end

  @impl Crawly.Spider
  def init() do
    [start_urls: ["https://shopee.com.my/search?keyword=computer"]]
  end

  @impl Crawly.Spider
  def parse_item(response) do
    urls =
      response.body
      |> Floki.find(".a._35LNwy")
      |> Floki.attribute("href")

    requests =
      Enum.map(urls, fn url ->
        url
        |> build_absolute_url(response.request_url)
        |> Crawly.Utils.request_from_url()
      end)

    name =
      response.body
      |> Floki.find("._1POlWt")
      |> Floki.text()

    price =
      response.body
      |> Floki.find("._5W0f35")
      |> Floki.text(deep: false, sep: "")

    %Crawly.ParsedItem{
      requests: requests,
      items: [
        %{name: name, price: price}
      ]
    }
  end

  def build_absolute_url(url, request_url) do
    URI.merge(request_url, url) |> to_string()
  end

  def show() do
    case HTTPoison.get("https://shopee.com.my/search?keyword=computer") do
      {:ok, %HTTPoison.Response{status_code: 200, body: body}} ->
        urls =
          body
          |> Floki.find(".a._35LNwy")
          |> Floki.attribute("href")

        IO.puts(urls |> to_string())
        {:ok, urls}
    end
  end
end
The error I got:
iex(1)> Crawly.Engine.start_spider(Shopee)
01:32:49.699 [error] GenServer #PID<0.408.0> terminating
** (UndefinedFunctionError) function Shopee.init/1 is undefined or private
(shopee 0.1.0) Shopee.init([crawl_id: "8ed539e0-bf11-11eb-86f5-3085a98593d6"])
(crawly 0.13.0) lib/crawly/manager.ex:126: Crawly.Manager.handle_continue/2
(stdlib 3.8) gen_server.erl:637: :gen_server.try_dispatch/4
(stdlib 3.8) gen_server.erl:388: :gen_server.loop/7
........
It's OK, I fixed it.
Instead of @behaviour, I declared it with use,
and I defined init with an ignored parameter (_). So it's:
@impl Crawly.Spider
def init(_) do
  [start_urls: ["https://shopee.com.my/search?keyword=computer"]]
end
instead of
@impl Crawly.Spider
def init() do
  [start_urls: ["https://shopee.com.my/search?keyword=computer"]]
end
<Connector port="8010" protocol="HTTP/1.1"
           connectionTimeout="20000"
           URIEncoding="UTF-8"
           address="0.0.0.0"
           redirectPort="8443"
           maxPostSize="-1"
           useIPVHosts="true" />
<Connector SSLEnabled="true" acceptCount="100" clientAuth="false"
           disableUploadTimeout="true" enableLookups="false" maxThreads="25"
           port="8443" connectionTimeout="20000" maxSwallowSize="-1" maxHttpHeaderSize="819200"
           keystoreFile="....keystore" keystorePass="****"
           protocol="org.apache.coyote.http11.Http11NioProtocol" scheme="https"
           secure="true" sslProtocol="TLS" compression="on" SSLVerifyClient="none" />
Open-source Flutter UI library
Hey guys, I'm using cheerio to scrape this table but it returns nothing. Did I do something wrong?
<table class="table trackTable">
  <thead>
    <tr>
      <th colspan="4">Consignment No: MY37011088606</th>
    </tr>
    <tr>
      <th>Consignment No</th>
      <th>Date & Time</th>
      <th>Status</th>
      <th>Location</th>
    </tr>
  </thead>
  <tbody>
    <tr>
      <td>MY37011088606</td>
      <td>03/07/2020 14:58:43</td>
      <td><b>Delivered</b></td>
      <td>Butterworth</td>
    </tr>
    .....
  </tbody>
</table>
and my code was
$ = cheerio.load(response['data']);
$.html();
$('.table#trackTable').each((index, element) => {
  if (index === 0) return (true);
  console.log(element);
});
Any idea? Is the naming of the selector wrong?
In the browser I did
$(".table > tbody:nth-child(2)").each(function(i, item) { console.log(item.innerText) });
and it returns something, but cheerio doesn't.