Hello!
I am getting a weird result from running a piece of dbplyr code.
What I have is a column in a table where every value is an 11-character string with a bunch of leading zeros, something like this:
acct_no
00000001653
00000001666
00000001693
00000001707
00000001735
00000001746
00000001751
00000001557
I query this table with the following code:
tbl(my_conn, in_schema("my_schema", "my_table")) %>%
select(ACCT_NO)
(the select() here is just for demonstration, so I don't carry the rest of the columns into the example)
But the result I get looks like this:
# Source: lazy query [?? x 1]
# Database: Microsoft SQL Server 13.00.5081[credentials_credentials_credentials]
ACCT_NO
<chr>
1 0
2 0
3 0
4 0
5 0
6 0
7 0
8 0
9 0
10 0
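In case it helps with diagnosing this, the SQL that dbplyr translates the pipeline into can be inspected with show_query() (same connection and table as above):

```r
library(dplyr)
library(dbplyr)

# prints the translated SQL instead of executing the query
tbl(my_conn, in_schema("my_schema", "my_table")) %>%
  select(ACCT_NO) %>%
  show_query()
```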
Furthermore, adding filter(ACCT_NO != "0") seems to have no effect at all if placed before collect(), but it does work after collect().
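To make that concrete, here is a sketch of the two variants I tried (connection and table names as above):

```r
# filter() before collect() is translated into the SQL WHERE clause;
# for me this has no effect -- the "0" rows still come back:
tbl(my_conn, in_schema("my_schema", "my_table")) %>%
  select(ACCT_NO) %>%
  filter(ACCT_NO != "0") %>%
  collect()

# filter() after collect() runs locally in R on the collected tibble,
# and this one does drop the "0" rows:
tbl(my_conn, in_schema("my_schema", "my_table")) %>%
  select(ACCT_NO) %>%
  collect() %>%
  filter(ACCT_NO != "0")
```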
It doesn't happen on every table, only on some of them. On the tables where it doesn't happen, the top values seem to have fewer leading zeros (not sure if that's a coincidence or not).
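In case anyone wants to try reproducing this, here is a minimal sketch that round-trips the sample values through a scratch table (the table name acct_test is made up, and it assumes write access on the connection; I believe temporary = TRUE works with the odbc backend):

```r
library(DBI)
library(dplyr)

# the sample values from the post
accts <- data.frame(
  ACCT_NO = c("00000001653", "00000001666", "00000001693",
              "00000001707", "00000001735", "00000001746",
              "00000001751", "00000001557"),
  stringsAsFactors = FALSE
)

# write them to a temporary table, then read back through dbplyr
dbWriteTable(my_conn, "acct_test", accts, temporary = TRUE)
tbl(my_conn, "acct_test") %>%
  select(ACCT_NO) %>%
  collect()
```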
Has anyone dealt with this before?