I am mostly complaining about his writing style. Obviously the subject itself is interesting (to some people)
I wouldn’t bet my eye on it, but who knows! Maybe he was a better teacher before!
Again, Knuth himself said in a preface that Volumes 2 through 5 are independent.
That sounds interesting, will take a look. I am not against theoretical computer science, I just think Knuth doesn’t read like a good teacher…
Because volume 1 is not available in the library
Edit: but also, the volumes aren’t dependent on each other. They treat very different topics; I doubt reading Volume 1 will help with Volume 4.
I feel offended by you somehow equating Perl and Lisp
This is a much better done meme
The one before makes zero sense
2100 parameters is a documented ODBC limitation (which applies to all statements in a batch).
This means that an
“insert into (c1, c2) values (?,?), (?,?)…” statement can only have 2100 bound parameters in total. It has nothing to do with the code, and even less with the surrounding code being “spaghetti”.
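The usual workaround is just chunking the rows so each statement stays under the cap. A minimal sketch of the idea (illustrated with pyodbc; the table and column names are made up, this is not the actual code):

    # Sketch: split a multi-row parameterized INSERT into chunks so every
    # statement stays under SQL Server's 2100-bound-parameter limit.
    import pyodbc

    MAX_PARAMS = 2100

    def chunked_insert(conn, rows, table="dbo.results", cols=("c1", "c2")):
        ncols = len(cols)
        rows_per_stmt = MAX_PARAMS // ncols              # e.g. 2100 // 2 = 1050 rows
        cur = conn.cursor()
        for i in range(0, len(rows), rows_per_stmt):
            chunk = rows[i:i + rows_per_stmt]
            placeholders = ", ".join("(" + ", ".join(["?"] * ncols) + ")" for _ in chunk)
            sql = f"INSERT INTO {table} ({', '.join(cols)}) VALUES {placeholders}"
            flat = [v for row in chunk for v in row]     # one flat parameter list per statement
            cur.execute(sql, flat)
        conn.commit()

With ~50 columns that works out to only 42 rows per statement, which is exactly why a multi-row VALUES insert crawls here.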
The tables ARE normalised; the fact that there are 50 columns is because the underlying market-data calibration functions expect dozens of parameters, and return dozens of other results, such as volatility, implied durations, forward duration and more
The amount of immaturity, inexperience, and ignorance coming from 2 people here is astounding
Blocked
You should take a break from trolling
I timed the transaction and the opening of the connection; it takes maybe 100 milliseconds, which absolutely doesn’t explain the abysmal performance
The transaction is needed because 2 tables are touched; I don’t want to deal with partially inserted data
Cannot share the code, but it’s python calling .NET through “clr”, and using SqlBulkCopy
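The shape is roughly the following (a minimal sketch from memory, assuming pythonnet on .NET Framework; the table and column names here are made up, not the real ones):

    import clr
    clr.AddReference("System.Data")      # SqlBulkCopy / SqlClient live here on .NET Framework

    import System
    from System.Data import DataTable
    from System.Data.SqlClient import SqlConnection, SqlBulkCopy

    def bulk_copy(conn_str, rows):
        # Build an in-memory DataTable matching the destination schema (hypothetical columns)
        table = DataTable()
        table.Columns.Add("instrument_id", System.Type.GetType("System.Int32"))
        table.Columns.Add("implied_vol", System.Type.GetType("System.Double"))
        for r in rows:
            table.Rows.Add(r[0], r[1])

        conn = SqlConnection(conn_str)
        conn.Open()
        try:
            bulk = SqlBulkCopy(conn)
            bulk.DestinationTableName = "dbo.calibration_results"   # hypothetical name
            bulk.BatchSize = 5000
            bulk.WriteToServer(table)
        finally:
            conn.Close()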
What do you suggest, if I shouldn’t be using that? It’s either a prepared query with thousands of parameters, or a plain text string with the parameters inlined (which, admittedly, I didn’t try; it might be faster lol)
Will try bcp & report back. EDIT: I can’t install bcp because it is only distributed with SQL Server itself, and I cannot install it on my corporate laptop.
I will try bcp. Somehow, I was convinced I had to have access to the machine running the SQL Server to use it, but from the docs I see I can specify a remote host… Will report back! EDIT: I can’t install bcp because it is only distributed with SQL Server itself, and I cannot install it on my corporate laptop.
Please enlighten us? You barely know anything about the system or its usage, and you have deduced NoSQL is better? Lol
I am using SqlBulkInsert; given how bad MS is with naming things, that might as well be doing row inserts instead of bulk inserts
Oh buddy, enjoy your life & don’t touch Microsoft even with a 10-meter stick
I would not recommend using parquet instead of csv. Indeed, parquet is a type of wooden flooring, while csv is a human-readable file format. As you can see, it is not wise to replace one with the other. Don’t hesitate to ask more questions regarding your home design!
So they posted that screenshot before even trying to run it on some useless file to see if it works… Internet points are surely a drug
React + Python + Postgres/SQLite
Your last question is equivalent to: why are there so many math theories? Can’t we just reuse the old ones?
New languages appear as a natural product of research in type theory, for example
You’re not stupid; Python’s packaging & versioning is a PITA. As long as you write it for yourself, you’re good. As soon as you want to share it, you have a problem