Wednesday, December 15, 2021

RStudio

https://www.dropbox.com/s/3m8i4g137h6rdzj/RStudio-2021.09.1-372.exe.7z.003?dl=0

https://www.dropbox.com/s/moyndu5eieunudb/RStudio-2021.09.1-372.exe.7z.002?dl=0

https://www.dropbox.com/s/b2rz1azs87zesk1/RStudio-2021.09.1-372.exe.7z.001?dl=0

Tuesday, July 12, 2016

Compile tds_fdw on Windows

After much trial and error, this is what I did in order to compile the tds_fdw code in a Windows 7 64-bit environment with Visual Studio 2013.

- Installation of dependencies:

- Postgres sources: I installed Postgres 9.4.5, downloaded the sources from here, and placed them inside the same installation directory.
You'll also need the header file libintl.h, which I got from here.

- Windows SDK, more specifically this library:
C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\Lib\x64\WS2_32.Lib

- FreeTDS, which I got from here:
https://github.com/FreeTDS/freetds
I used the latest commit at the time.

- I used some guidelines from this adaptation of the odbc_fdw project to Visual Studio to create my own Visual Studio project:

https://github.com/Yarmonov/odbc_fdw/commit/33ee7e0791a3ec00b26c84c2b71ec72ba58ccfcf

I used the Debug and Release profiles and macros like PGROOT to install the compiled DLL in (PostgresDir)\lib.
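A macro like PGROOT can be wired into a post-build event so the DLL lands in the Postgres lib folder automatically. A hypothetical .vcxproj fragment (only the PGROOT macro name comes from the project above; the rest is illustrative):

```xml
<!-- Hypothetical post-build step: copy the built extension into the
     Postgres installation. PGROOT is a user macro pointing at the
     Postgres root directory, e.g. C:\Program Files\PostgreSQL\9.4 -->
<PostBuildEvent>
  <Command>copy /Y "$(TargetPath)" "$(PGROOT)\lib\tds_fdw.dll"</Command>
</PostBuildEvent>
```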

I put the entire freetds folder in the Visual Studio project; however, not all of its code files are included in the project, only the required ones.

These are the steps I followed:

A) First of all, I put together the sources of tds_fdw in one folder (the .c and .h files).

B) There are a couple of files that are normally generated when building freetds, but since I didn't compile freetds I added these files, mainly from this reference:
http://www.freetds.org/reference/files.html

sysconfdir.h
version.h
encodings.h
num_limits.h
tds_sysdep_public.h
tds_willconvert.h
bkpublic.h
config.h
getaddrinfo.h (this one is from http://www.netperf.org/svn/netperf2/trunk/src/missing/getaddrinfo.h)
poll.h
readpassphrase.h

I got all these files from different repositories; this is the 'walking on eggshells' part. It is important to know that these files are not included in the freetds repository and came from different sources, so this might cause problems somewhere, but so far I haven't had any issues in my compilation or in my usage tests.

C) The file getaddrinfo.c has this declaration at line 201:
in_addr_t ipaddr;
The in_addr_t type is missing on Windows, so I declared it just above that line:
typedef DWORD in_addr_t;

I got that solution from here

The same file has a missing reference, 'hostip', at line 263:
strlcpy(hostip, inet_ntoa(sin->sin_addr), hostlen);
I commented out this line; I really couldn't find any declaration of this variable, so this is another starting point for improvements.
In the same file I added the #include "getaddrinfo.h" line to make the compiler find EAI_OVERFLOW.

D) I changed sybdb.h, adding this include:

#include <inttypes.h>

E) Regarding the bjoern-utf8.h and bjoern-utf8.c files, I moved the utf8_table definition from bjoern-utf8.c into bjoern-utf8.h and excluded the .c file from the project.

In Debug mode, five files are generated:
tds_fdw.exp
tds_fdw.lib
tds_fdw.dll
tds_fdw.ilk
tds_fdw.pdb

On other Windows installations I noticed it is better to compile in Release mode, because a DLL built in Debug mode depends on MSVCR120D.dll.
(A similar scenario is described here: http://stackoverflow.com/questions/23114427/vs2012-msvcr120d-dll-is-missing)

There are a couple of custom configurations in the Visual Studio project, but this one is worth mentioning: in Project Properties -> Configuration Properties -> C/C++ -> Output Files, I set the Object File Name option to:

$(IntDir)/%(RelativeDir)/

That is, the different .obj files are generated in different folders.
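In the .vcxproj file this setting would appear roughly as follows (illustrative fragment); the point, presumably, is that same-named .c files living in different freetds subfolders no longer overwrite each other's .obj:

```xml
<!-- Per-directory object files: each source file's .obj is written under
     a subfolder of the intermediate directory mirroring its source path -->
<ClCompile>
  <ObjectFileName>$(IntDir)/%(RelativeDir)/</ObjectFileName>
</ClCompile>
```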

If everything is correct you should be able to build the project, and the DLL will be generated and installed in (PostgresDir)\lib.


You can download the Visual Studio project here:

https://www.dropbox.com/s/i759cspy58cq87u/tds_fdw_.rar?dl=0


If you think I'm missing something, please let me know.

I think that's it. I don't think this is the best solution, but I hope it is a starting point for a better implementation by someone else :)

Last but not least, sorry for any grammar or spelling mistakes :) I welcome constructive criticism of my writing as well.

Tuesday, August 26, 2014

Public-key authentication in SSH

Every time I wanted to stop or start the cluster it asked me for the user's password (in my case, hadoop). What I did was make sure I had the right permissions on the ssh folder:

chmod 700 ~/.ssh/
chmod 600 ~/.ssh/*

Then I added the key (which I had already generated):

ssh-add

After that I could authenticate to ssh localhost without a password :D

Source

Error: the datanode does not start

When running the jps command the datanode did not appear. Checking this log:

tail -n 20 /usr/local/hadoop/logs/hadoop-hadoop-datanode-hadoop-VirtualBox.log

I got the following error:

STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common -r 1529768; compiled by 'hortonmu' on 2013-10-07T06:28Z
STARTUP_MSG:   java = 1.7.0_55
************************************************************/
2014-08-24 19:26:09,907 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2014-08-24 19:26:14,307 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Invalid dfs.datanode.data.dir /home/hadoop/mydata/hdfs/datanode :
EPERM: Operation not permitted
 at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmodImpl(Native Method)
 at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmod(NativeIO.java:158)
 at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:635)
 at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:468)
 at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:130)
 at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:146)
 at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:1698)
 at org.apache.hadoop.hdfs.server.datanode.DataNode.getDataDirsFromURIs(DataNode.java:1745)
 at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1722)
 at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1642)
 at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1665)
 at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1837)
 at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1858)
2014-08-24 19:26:14,366 FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.io.IOException: All directories in dfs.datanode.data.dir are invalid: "/home/hadoop/mydata/hdfs/datanode"
 at org.apache.hadoop.hdfs.server.datanode.DataNode.getDataDirsFromURIs(DataNode.java:1754)
 at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1722)
 at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1642)
 at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1665)
 at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1837)
 at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1858)
2014-08-24 19:26:14,403 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1
2014-08-24 19:26:14,489 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:

I solved the problem with the help of this post; basically:

1.) Stop the cluster: stop-all.sh
2.) Change the permissions of the folder that the dfs.datanode.data.dir variable points to (specified in hdfs-site.xml):
sudo chown hduser:hadoop -R /home/hadoop/mydata/hdfs/
sudo chmod 777 -R /home/hadoop/mydata/hdfs/
3.) Format the namenode and start again:
hadoop namenode -format
start-all.sh