Channel: Active questions tagged python - Stack Overflow

Empty columns while scraping data with Selenium


I'm trying to scrape data with Selenium. Because simultaneous scrolling was causing problems, I instead enlarge the relevant div with a JavaScript snippet and scrape from there. Now, however, my CSV file has far too many empty columns, even though I assign fallback values to fields that can't be found via a NoSuchElementException handler. My code looks like this:

```python
import time

import pandas as pd
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.common.exceptions import (
    ElementClickInterceptedException,
    NoSuchElementException,
)

# `driver` (the webdriver instance) and `wait` (a WebDriverWait) are created earlier

driver.execute_script("document.body.style.zoom='50%'")
time.sleep(5)  # allow time for the page to load

players_data = []
a_elements = driver.find_elements(
    By.XPATH,
    "//a[contains(@class, 'sc-3937c22d-0') and contains(@class, 'jrbLdB')]"
    " | //span[contains(@class, 'sc-gFqAkR cLzxjv') and contains(text(), 'Out')]/ancestor::a",
)

# JavaScript that enlarges the scrollable div so its content is rendered
script = """
arguments[0].style.height = '1500px';   // maximum height
arguments[0].style.width = '500px';
arguments[0].style.maxHeight = 'none';  // let the div cover the whole screen
arguments[0].style.overflow = 'visible';
"""

for a_element in a_elements:
    try:
        wait.until(EC.element_to_be_clickable(a_element))
        a_element.click()
    except ElementClickInterceptedException:
        driver.execute_script("arguments[0].click();", a_element)
    time.sleep(3)  # wait for the page to load

    scroll_div = driver.find_element(
        By.XPATH,
        "//div[contains(@class, 'sc-fqkvVR') and contains(@class, 'sc-dcJsrY')"
        " and contains(@class, 'jHDYJH') and contains(@class, 'llKrgf')]",
    )
    driver.execute_script(script, scroll_div)
    time.sleep(3)

    WebDriverWait(driver, 10).until(
        EC.presence_of_element_located(
            (By.XPATH, "//div[contains(@class, 'sc-gFqAkR') and contains(@class, 'kNXPSQ')]")
        )
    )
    try:
        name = driver.find_element(
            By.XPATH, "//div[contains(@class, 'sc-gFqAkR') and contains(@class, 'kNXPSQ')]"
        ).text
    except NoSuchElementException:
        name = "No Name"

    players_data.append({"Name": name})

    # close the detail panel again
    svg_element = driver.find_element(
        By.XPATH,
        "//div[@class='sc-fqkvVR sc-dcJsrY jHDYJH llKrgf']"
        "//div[@class='sc-fqkvVR kdqiyG']/button[@class='sc-aXZVg iYqhhm'][2]",
    )
    driver.execute_script("arguments[0].click();", svg_element)
    time.sleep(3)  # wait for the operation to complete

driver.quit()

df_players = pd.DataFrame(players_data)
# save the DataFrame to a CSV file: target directory, file name, and extension
csv_file_path = 'bjkscrap/bjk-antalya3.csv'
df_players.to_csv(csv_file_path, index=False, encoding='utf-8')
```
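One likely culprit for the empty names: Selenium's `WebElement.text` returns only *rendered* text. If the target div is present in the DOM but hidden (for example, still collapsed by `overflow`/`max-height` styles before the resize script takes effect), `.text` comes back as an empty string without raising `NoSuchElementException`, so the `"No Name"` fallback never triggers. A minimal sketch of a fallback that reads the raw `textContent` via JavaScript when that happens (the helper name is hypothetical):

```python
def visible_text_or_dom_text(driver, element):
    """Return the element's rendered text; if the element is hidden and
    .text is empty, fall back to the DOM's textContent via JavaScript."""
    text = (element.text or "").strip()
    if text:
        return text
    raw = driver.execute_script("return arguments[0].textContent;", element)
    return (raw or "").strip()
```

Used in the loop above it would become `name = visible_text_or_dom_text(driver, name_div) or "No Name"`, which distinguishes "element hidden" from "element missing".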

I want to export the data to a CSV file, but my name columns come out empty. The data I'm after is on the pages that open when an element on the given page is clicked.
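Independent of the XPath selectors, the fixed `time.sleep(3)` calls may fire before the detail panel has finished rendering, which would also produce empty names. Condition-based waiting is more reliable; as a minimal illustration of the idea in plain Python (`wait_for` is a hypothetical stand-in for what `WebDriverWait.until` does):

```python
import time

def wait_for(predicate, timeout=10.0, interval=0.5):
    """Poll `predicate` until it returns a truthy value, or raise after
    `timeout` seconds -- roughly what WebDriverWait(driver, timeout).until(...)
    does, but usable with any plain callable."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        value = predicate()
        if value:
            return value
        time.sleep(interval)
    raise TimeoutError(f"condition not met within {timeout:.1f}s")
```

With Selenium itself this would be something like `WebDriverWait(driver, 10).until(lambda d: d.find_element(By.XPATH, name_xpath).text.strip())` — waiting until the text is actually non-empty rather than merely until the element is present.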


